1.2 Evaluation methodology

The ICPE was conducted according to the approved IEO process and methodology (see full evaluation framework in Annex 2) and adhered to United Nations Evaluation Group norms and standards.2 At the start of the evaluation, the evaluation team, together with the UNDP country office, identified and validated the list of all projects within the programme cycle that would form the basis for the analysis (complete project list in Annex 3). This was followed by a desk review of reference material, including country programme strategies, project and programme documents, monitoring reports, audits and evaluations (see Annex 4 for full list), and a stakeholder analysis. The evaluation sought balanced representation of the different types of actors involved in the programme, including government officials, project implementing partners, beneficiary groups, United Nations agencies and other development partners from donors, civil society, the private sector and academia.

The evaluation employed a rating system to assess the performance of the country programme using the five internationally agreed evaluation criteria of relevance, coherence, effectiveness, efficiency and sustainability (OECD-DAC, 2020). The relevance and coherence of the programme were assessed in relation to key national development policies and strategies, changes in the national context and the interventions of other international actors in the country. The capacity of UNDP to adapt to the changing context and respond to national development needs and priorities was also considered. The effectiveness of the UNDP country programme was analysed through an assessment of progress towards expected outputs, and the extent to which these outputs contributed to the intended CPD outcomes. To better understand UNDP performance and the efficiency and sustainability of results in the country, the ICPE examined the specific factors that influenced the programme, both positively and negatively.

Due to the outbreak of the COVID-19 pandemic, and in consultation with the country office, the ICPE was conducted remotely. To offset some of the challenges of remote evaluations, IEO hired a team of national consultants to support the data collection and analysis process.

The evaluation relied on information collected from different sources and triangulated before the final analysis. These sources included:

• A review of UNDP documentation on projects and programmes in Bolivia, decentralized evaluations, research papers and other available country-related publications.3

• An in-depth evaluation questionnaire to collect evidence on reported results, completed by the UNDP country office and further discussed and validated during interviews.

• An analysis of the programme outcomes and theories of change (ToCs), and mapping of implemented projects against the goals set in the CPD.

• A total of 80 semi-structured virtual interviews with stakeholders (see Annex 5 for a complete list), to collect data, assess perceptions on the scope and effectiveness of programme interventions, determine factors affecting performance and identify the strengths and weaknesses of the UNDP programme. In this process, IEO ensured that the evaluation team and stakeholders were not put in harm’s way by the pandemic.

2 See the website of the United Nations Evaluation Group: http://www.unevaluation.org/document/detail/1914.

3 Eight decentralized evaluations were conducted in the period under review. The quality assessment of the reports was: one satisfactory, one moderately satisfactory, three moderately unsatisfactory, one unsatisfactory and two pending assessment. Evaluation reports are available at https://erc.undp.org/evaluation/plans/detail/1460.