Value for Money (VfM) assessment of Leadership for Change (LFC) programme

1. Background

The Leadership for Change (LFC) programme seeks to improve outcomes for women and girls by increasing their influence in legislative and policy-making processes and by improving their ability to access and benefit from services. DFID is providing up to £4.5m over the period 2012-2015.

The impact of the programme is expected to be "improved service delivery and service uptake by women and girls". The outcome is "increased influence of women and girls in formal and informal political decision making". The outcome and impact are expected to be delivered through four outputs:

1. Increased number of women leaders supported through targeted interventions to increase capacity, connections and leadership skills. Implementation is through the Vital Voices Lead Fellowship (VV) programme.
2. Adolescent girls and young women provided with the means to develop their leadership skills and challenge gender norms. Implementation is through the Women Win Building Young Women's Leadership through Sport (WW) programme.
3. Increase the evidence base on approaches for sustainable long-term attitudinal and behaviour change towards gender equality among boys and men. The contract for delivering this output was awarded to a consortium led by the Institute of Development Studies (IDS).
4. Increase the evidence base on approaches to increase girls' and women's voice and leadership in decision-making processes at all levels, including in formal political processes and fora. The contract for delivery of this output was awarded to the Overseas Development Institute (ODI).

A Value for Money (VfM) ratings assessment was carried out at the Business Case stage.
This drew heavily on an approach outlined in a 2010 report on measuring the VfM of governance programmes.1 Options were rated against a number of qualitative criteria intended to capture the three VfM dimensions: economy, efficiency and effectiveness (explained in more detail below). However, whilst the Business Case made some effort to demonstrate the VfM of the alternative options, no specific VfM measures were suggested that would help track the VfM of the programme over time.

An Annual Review of the programme was carried out in January 2014. The reviewers were not able to complete a VfM assessment as part of the review, and instead recommended that a VfM assessment should be completed by July 2014. This note addresses that recommendation. The period covered by this VfM assessment is the same as that covered by the Annual Review: October 2012 to December 2013.

2. Overview of DFID's approach to VfM

Achieving value for money means getting the greatest impact on poor people's lives from each pound that DFID spends. When thinking about value for money, we tend to talk about the three "Es": economy, efficiency and effectiveness. These describe the links between inputs, outputs and outcomes: a programme is economic if its inputs are bought at the lowest possible cost; it is efficient if its outputs are delivered at the lowest possible cost; and it is effective if its outcomes are achieved at the lowest possible cost. Cost effectiveness is a catch-all term that describes the links from inputs to outcomes.2
1 Barnett, C. et al, "Measuring the Impact and Value for Money of Governance and Conflict Programmes" (Dec 2010)
2 More detail is provided in "DFID's approach to Value for Money" (July 2011)
In order to quantitatively assess VfM, we need to find ways of measuring these different concepts. One way of doing this is through unit costs. However, given the description of the 3Es above, it is clear that we need to examine unit costs at several different levels. In the context of the LFC programme, we could look at unit costs of inputs (e.g. cost of training materials) in order to understand the economy of the programme; unit costs of outputs (e.g. cost per participant in the VV programme) in order to understand the efficiency of the programme; and unit costs of outcomes (e.g. cost per indirect beneficiary) to understand the effectiveness of the programme.

However, it tends to be more difficult to estimate unit costs the further one moves along the results chain. Whilst unit costs for inputs may be relatively straightforward to define, unit costs per outcome are much harder to define, particularly when, as in the LFC programme, there are challenges in defining and measuring the outcome indicators themselves.3

Whilst unit costs are important, there are other things we can measure in order to capture the VfM of the project. Results are clearly an important part of the VfM agenda, so a measure of the overall success of the project (e.g. the Annual Review score) could be a further VfM metric in assessing effectiveness. We can also look at the extent to which procurement is being actively used as a way to manage costs and drive value for money. In addition, it is useful to look at the variance between budgeted and actual spend under the project: in order to achieve VfM, we want this variance to be as small as possible, since money which remains unspent at the end of the month might have been better allocated to another purpose.
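The three levels of unit cost described above can be sketched in a few lines. This is an illustration only: the `unit_cost` helper and all figures are hypothetical, not drawn from the programme budget.

```python
# Illustrative only: a unit cost at each level of the results chain gives a
# quantitative handle on one of the three "E"s. All figures are hypothetical.
def unit_cost(total_spend, units):
    """Spend attributable to a level of the results chain, divided by units delivered."""
    return total_spend / units

economy = unit_cost(5_000, 100)             # input level: cost per set of training materials
efficiency = unit_cost(250_000, 95)         # output level: cost per programme participant
effectiveness = unit_cost(250_000, 12_500)  # outcome level: cost per indirect beneficiary

print(economy, efficiency, effectiveness)
```

The outcome-level figure is the hardest to pin down in practice, since attributing spend to indirect beneficiaries requires agreed outcome indicators.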
3. Methodology for assessing VfM of the LFC programme

Since no VfM indicators were proposed at Business Case stage, a key first step is to identify potential metrics for assessing VfM of the programme now and going forward. Table 1 below suggests some indicators that could be used to assess VfM of the programme, based on output and outcome indicators in the logframe and data that is currently available, or likely to be relatively easy to obtain from project implementers. These indicators only cover outputs 1 and 2 of the programme. According to the Annual Review, performance indicators for outputs 3 and 4 will be agreed during the inception period in the first quarter of 2014.

Table 1: Proposed indicators for assessing VfM of the LFC programme
VfM Dimension: Economy

Indicator: Admin/overhead cost per £1 of programme spend under each project
Comments: The Business Case stated that overhead costs per £1 of programme spend were expected to be £0.12 for the VV programme and £0.19 for the WW programme. For outputs 1 & 2, this indicator can be monitored from programme expenditure statements. Data is not currently available for outputs 3 & 4.

Indicator: Cost for delivery of online platform (under VV programme)
Comments: The budget for delivering the online portal is £33,109. Not terribly useful as an indicator: it is difficult to benchmark against other projects, since the cost of an online platform will depend on the functions it needs to perform. Since this is a one-off activity (and the cost can only be calculated once the portal is fully functional), there is also no scope for tracking VfM over time.

VfM Dimension: Efficiency

Indicator: Cost per training course developed (under VV programme)
Comments: To monitor performance against this indicator you would need to know the number of training courses developed, as well as the cost of developing these. This information does not appear to be readily available through the logframe, the budget or quarterly expenditure statements. However, there seems no reason why this information could not be provided by the project implementers if this indicator was of interest to DFID. With additional information on the number of people trained, a further efficiency indicator ("cost per person trained") could also be calculated. This latter indicator may be easier to benchmark against costs from other programmes with a training component.

Indicator: Cost per Digital Storytelling (DST) or Audio Storytelling (AST) workshop (under WW programme)
Comments: The cost of the DST component is €72,893 according to the overall budget (although additional costs of 'DST capacity building' are recorded under the 'Building Capacity' component). The number of workshops is not captured in the logframe but should be easy to report and verify. In terms of calculating performance against this metric over time, the quarterly expenditure statements provide some information on 'DST capacity building' costs, but it is not clear whether these relate to workshop costs and/or whether they capture all costs associated with running the workshops. Without greater clarity it is difficult to calculate performance against this metric.

Indicator: Cost per beneficiary of the VV programme; cost per beneficiary of the WW programme
Comments: No unit cost information was provided in the Business Case. However, it is possible to track performance under these metrics by taking information on the number of beneficiaries from output indicators 1.1 and 2.1 of the logframe and comparing this with total project expenditure to date.

Indicator: Cost per South to South (S2S) exchange or cost per participant in an S2S exchange (under VV programme)
Comments: The S2S component of the budget is anticipated as being £337,438. This is intended to cover two S2S exchange forums, implying a cost per exchange of £168,719. Forty-five participants are expected to take part in each exchange, giving a cost per participant of £1,875. This would seem a difficult cost to benchmark against other programmes given the unique characteristics of the project, but it could be interesting to compare ex-post costs with those expected ex-ante. To monitor performance against this metric you would need to know the number of S2S exchanges that had taken place, how many participants were involved, and the costs associated with delivering this particular component of the programme. The number of exchanges or participants is not captured in the logframe but would be easy to report and verify. However, costs for different programme components are not currently easy to identify from quarterly financial statements, which combine costs from different elements of the programme in one budget line (e.g. 'consultant travel'). The format of quarterly finance statements would need amending to allow this information to be captured.

Indicator: Cost per Peer to Peer (P2P) exchange or cost per person participating in a P2P exchange (under VV programme)
Comments: The P2P component of the budget is anticipated as being £173,493. This is intended to cover four P2P mentoring trips in each of the first two years and three in the last year, implying a cost per exchange of £15,722. Three participants were expected to take part in each exchange, giving a cost per participant of £478. As above, this would seem a difficult cost to benchmark against other programmes, but it could be interesting to compare ex-post costs with ex-ante costs. To monitor performance against this metric you would need to know the number of P2P exchanges that had taken place, how many participants were involved, and the costs associated with delivering this component. As above, the number of exchanges or participants is not captured in the logframe but would be easy to report and verify; however, the format of quarterly finance statements would need amending to allow cost information to be captured.

Indicator: Cost per fellow for online platform development and delivery of online training (under VV programme)
Comments: To monitor performance against this indicator you would need to know the number of people making use of the online platform, as well as the cost of developing the platform and online training courses. The budget for delivering the online portal is £33,109, although it is not clear whether this includes the cost of delivering online training or just setting up the platform. This would need to be clarified and costs against this item clearly presented in the quarterly finance statements. The number of users is not currently captured in the logframe but should be relatively easy to report.

Indicator: Cost per mentoring workshop or cost per participant in the mentoring programme (under WW programme)
Comments: To monitor performance against this indicator you would need to know the number of mentoring workshops or number of participants, as well as the cost of running these. The cost of the mentoring component is €117,279 according to the overall budget. This can be combined with information from output indicator 2.4 in the logframe (i.e. a cumulative total of 48 participants are targeted) to give an expected cost per participant of €2,443. In terms of monitoring performance against this indicator, we know that there were 11 participants in the mentoring programme (against a target of 8). However, the only costs associated with 'mentoring' that appear in the quarterly expenditure statements are €3,869 in Jan-Mar 2013. There must have been other costs associated with this activity, but these are not clearly set out in the expenditure statements. Without clearer reporting of costs against this programme component it is difficult to calculate performance against this metric.

Indicator: Unit cost per girl trained in DST/AST (under WW programme)
Comments: The cost of the DST component is €72,893 according to the overall budget (although additional costs of 'DST capacity building' are recorded under the 'Building Capacity' component). Data on the number of participants in DST workshops is captured in the logframe: the programme has a cumulative target of 120, implying an expected cost of €607 per girl trained. In terms of calculating performance against this metric over time, the quarterly expenditure statements provide some information on 'DST capacity building' costs, but it is not clear whether they capture all costs associated with running the workshops. Without greater clarity it is difficult to calculate performance against this metric.

VfM Dimension: Effectiveness

Indicator: Cost per research output under outputs 3 & 4
Comments: Data is not currently available for outputs 3 & 4.

Indicator: Cost per indirect beneficiary (i.e. per person benefitting from services provided by participants in the VV programme), or cost per beneficiary who subsequently engages in policy change
Comments: Not possible to assess at this stage, given the lack of agreed outcome indicators. The most recent Annual Review (Jan 2014) recommended a review of outcome indicators 1.1, 1.2, 1.3 and 1.5 given challenges around data collection. VfM effectiveness indicators will depend on decisions made as part of this review.

3 The recent Annual Review noted that three out of four of the outcome indicators for the LFC programme presented reporting challenges. Collecting accurate data was highlighted as a challenge, as was independent validation of data collected. The Annual Review recommended that new outcome indicators be agreed, along with a revised M&E plan.
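Each ex-ante unit cost in Table 1 is simply the budgeted component cost divided by the planned number of units. A minimal sketch of that arithmetic, using the budget figures quoted in the table (the dictionary labels are ours, for illustration only):

```python
# Ex-ante unit costs from Table 1: budgeted component cost / planned units.
# Budget and target figures are those quoted in the table; labels are illustrative.
components = {
    "mentoring, EUR": (117_279, 48),     # budget, targeted participants (indicator 2.4)
    "DST training, EUR": (72_893, 120),  # budget, cumulative target of girls trained
    "S2S exchanges, GBP": (337_438, 2),  # budget, planned exchange forums
}

for name, (budget, units) in components.items():
    print(f"{name}: {budget / units:,.0f} per unit")
```

The same calculation can be re-run ex-post once actual component costs and unit counts are reported, which is why the table stresses clearer cost reporting in the quarterly finance statements.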
The lead implementers under outputs 1 and 2 have provided a detailed budget breakdown, including some unit cost information for the programme period, as well as expenditure statements covering the period October 2012 to December 2013. Since activities under outputs 3 and 4 are only just beginning, no expenditure information is available for them.

Beyond unit cost measures, other information that can be used to assess overall VfM includes the Annual Review score (as a measure of overall project success), evidence of the strength of procurement systems and whether these are being used to manage costs, and the variance between budgeted and actual spend under the project.

4. Assessment of VfM of the programme
Unit costs

Over the period of the VfM assessment, the VV programme applied a fixed percentage (11.9%) to programme costs to cover 'general and admin' costs. This is consistent with the figure presented in the Business Case (i.e. £0.12 of admin/overhead cost per £ of programme spend). However, some 'other direct costs' such as 'telephone and internet', 'equipment rental' and 'books, periodicals, subscriptions' appear separately in the detailed budget breakdowns and could arguably also be classed as admin costs (and indeed appear to have been in an early proposal). Classifying these 'other direct costs' as admin/overhead costs brings the ratio of admin costs to programme costs to around 14% on average (i.e. £0.14 per £ of programme spend).

The Business Case used WomanKind, a similar NGO working in the sector, as a benchmark for admin costs. WomanKind was indicated as spending £0.21 on overhead costs per £ of programme spend, suggesting that the VV programme continues to offer VfM when compared against this particular benchmark. The DFID team also did some benchmarking against Global Poverty Action Fund (GPAF) grantees. This exercise looked at admin costs as a percentage of total commitments for 32 grantees. Admin costs ranged from 0% to 26% of total costs, but averaged around 11% (i.e. £0.11 per £ of programme spend). Admin costs under the VV programme look comparable, if marginally higher. However, without knowing what each GPAF grantee included in its definition of 'admin' costs, it is difficult to know whether we are comparing projects on a consistent basis.

The WW programme also identified a separate 'admin and general' budget line as part of the detailed expenditure statements. These admin/overhead costs ranged from 3% to 19% of total programme spend, depending on the quarter. On average, 'admin and general' costs comprised around 8% of the total.
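The sensitivity of the admin-cost ratio to classification choices can be sketched in a few lines. Only the 11.9% G&A rate below comes from the VV budget; the programme-spend and 'other direct costs' amounts are hypothetical, chosen purely to illustrate the narrow-versus-broad definition.

```python
# How the admin/overhead ratio shifts with classification choices. Only the
# 11.9% G&A rate is from the VV budget; other figures are hypothetical.
def admin_ratio(admin, programme):
    """Admin/overhead cost per £1 of programme spend."""
    return admin / programme

programme_spend = 100_000            # hypothetical programme spend
ga = 0.119 * programme_spend         # fixed 'general and admin' charge at 11.9%
other_direct = 2_500                 # hypothetical: telephone, equipment rental, subscriptions

print(f"narrow definition: £{admin_ratio(ga, programme_spend):.2f} per £1")
print(f"broad definition:  £{admin_ratio(ga + other_direct, programme_spend):.2f} per £1")
```

The point of the sketch is that benchmarking against WomanKind or GPAF grantees is only meaningful if all parties apply the same definition of the numerator.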
This appears to offer good VfM when compared to the VV programme, the WomanKind benchmark and GPAF grantees. However, in the detailed expenditure statements for the WW programme, some staff costs had also been allocated across individual project outputs (on top of programme staff costs). It is not clear whether these additional staff costs should be classified as programme costs or as admin and general costs. We would recommend agreeing with the project implementers a clear definition of what should be included within admin/overhead costs.

To measure the efficiency of the programme it is useful to consider costs per beneficiary. The Annual Review found that 95 participants in the VV programme were engaged in at least one network activity, compared to a target of 60 (output indicator 1.1 of the logframe). This implies a cost per beneficiary of $6,933 (~£4,116) in the first year of the programme, compared with an expected cost of nearly $11,000 (~£6,530). The Annual Review found that 12,490 adolescent girls and young women were participating in sports programmes as a result of the first year of the WW programme (output indicator 2.1 of the logframe). This implies a cost per beneficiary of €77 (~£62) in the first year of the programme, compared with an expected cost of around €96 (~£78) per beneficiary.

It is not possible to compare unit costs per beneficiary between these two programmes, nor to compare them with those of similar NGOs, since the nature of the two programmes is quite different. However, it will be useful to track performance within each programme over time. We can already see that unit costs per beneficiary appear lower under both programmes than anticipated at the outset, as a result of better-than-expected results against these particular indicators.
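The arithmetic behind the VV comparison is simple: with year-one spend roughly fixed, beating the beneficiary target pushes the realised unit cost below the ex-ante figure. A sketch (year-one spend is backed out from the quoted $6,933 unit cost and 95 participants, so treat it as approximate):

```python
# VV cost per beneficiary, year one. Spend is inferred from the quoted
# $6,933 unit cost and 95 participants, so it is an approximation.
vv_spend = 6_933 * 95        # implied year-one spend, ~$658,635
target, actual = 60, 95      # output indicator 1.1: target vs achieved

expected_unit = vv_spend / target   # cost per beneficiary had only the target been reached
actual_unit = vv_spend / actual     # realised cost per beneficiary

print(f"expected ~${expected_unit:,.0f}; actual ${actual_unit:,.0f} per beneficiary")
```

Dividing the same spend by 60 rather than 95 beneficiaries reproduces the "nearly $11,000" expected figure quoted above, which is why over-delivery against the output target translates directly into a lower unit cost.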
Project performance

The most recent Annual Review, in January 2014, gave the project an overall output score of A+ (outputs moderately exceeded expectations). The review concluded that both VV and WW made good progress in year one and that all indicators were either met or exceeded, with many substantially exceeded.

VfM in procurement

Vital Voices and Women Win are the implementers of outputs 1 and 2 respectively. The contractual arrangement with DFID is in the form of an Accountable Grant (AG). The Business Case notes that alternative providers were not considered and that the procurement process is not being actively used as a way to manage costs. It is possible that better value for money would have been achieved through tendering, which can help ensure value for money through competition amongst alternative providers. However, the Business Case justifies the use of an AG on the grounds that both organisations meet the AG criteria and have a strong track record of implementing similar interventions. A desk-based review at the time of the Business Case found that "both organisations have in place reasonable policies and procedures for managing their finances and procurement" and stated the intention to undertake full due diligence before accountable grants were signed. We understand that due diligence assessments were conducted and that the few recommendations made have since been addressed. Contracts under outputs 3 and 4 have been awarded following two separate competitive bidding processes.

Budgeting and forecasting

Looking at the quarterly expenditure statements under the VV and WW programmes, it was possible to assess the extent to which forecast expenditure for the next quarter matched actual expenditure. The WW programme performed relatively well against this metric: variance between forecasts and actual expenditure ranged from +5% to -9%.
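The budgeting metric used here is the simple percentage variance between forecast and outturn. A sketch with hypothetical quarterly amounts, chosen only to reproduce the endpoints of the reported WW range (+5% to -9%):

```python
# Forecast-vs-actual variance per quarter. A negative figure means underspend
# against forecast. The quarterly amounts below are hypothetical.
def variance_pct(forecast, actual):
    return (actual - forecast) / forecast * 100

quarters = [(40_000, 42_000), (50_000, 45_500)]  # (forecast, actual) pairs
for forecast, actual in quarters:
    print(f"{variance_pct(forecast, actual):+.0f}%")  # prints +5% then -9%
```

Tracking this figure each quarter makes it easy to see whether forecast accuracy is improving, which matters because persistent underspend ties up money that could have been allocated elsewhere.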
Variance was greater under the VV programme, ranging from -9% to -32%, depending on the quarter. As outlined above, we want the variance between forecast and actual spend to be as small as possible, since money which remains unspent at the end of the month might have been better allocated to another purpose.

5. Recommendations for the LFC programme

On the basis of the information available, our assessment is that the programme appears to offer Value for Money. The available unit cost information suggests the programme performs well compared to initial expectations and available benchmarks. Overall project performance (in terms of delivery against logframe outputs) is strong, and systems and procedures within implementing bodies appear designed to drive VfM in procurement. However, this initial VfM assessment has identified the following recommendations:

- Consider adding additional indicators to measure the economy and efficiency of specific components of the programme. Suitable candidates are suggested in Table 1 and include cost per training course/workshop and cost per participant in training courses/workshops under the different programmes. These indicators would need to be clearly defined and agreed with the programme implementers.
- Data should be routinely provided by VV and WW in order to track progress against a wider range of unit cost measures. The format of data should be agreed and remain consistent so that VfM performance can be easily tracked over time. In particular, expenditure statements need to clearly link costs with key activities of interest (e.g. particular workshops or training activities) in order to allow calculation of unit cost measures. Other (non-cost) data may also be required (e.g. number of specific types of workshops, number of workshop participants, number of online platform users) where this data isn't routinely captured as part of the current M&E arrangements.
- Programmes should clearly identify which costs are included under the 'admin & general' category and which are classified as programme costs. Treatment should be consistent across programmes.
- Effectiveness indicators should be developed alongside/following the review of outcome indicators (scheduled to have taken place in April 2014).
- Both the VV and WW projects, but particularly the VV project, could improve the accuracy of their forecasts.
- Further indicators should be developed to track the VfM of outputs 3 and 4, once the performance indicators have been agreed (scheduled to have taken place in the first quarter of 2014).
6. Documents referenced

- LFC year 1 Annual Review
- LFC logframe
- LFC Business Case
- Women Win project budget (3 years)
- Vital Voices project budget (3 years)
- Expenditure statements for VV covering the period Oct 2012 to Dec 2013
- Expenditure statements for WW covering the period Oct 2012 to Dec 2013
- VV note on categorising indirect (admin) costs

Sarah Love
Economic Adviser
June 2014