An Introduction to Performance Monitoring & Evaluation


An Introduction to Performance Monitoring & Evaluation (M&E): Fundamentals and 'How-to' Guide
Presented by: Ginelle Greene-Dewasmes, Monitoring & Evaluation Specialist


Module Areas

• Overview
• Baseline Assessment
• M&E Framework Overview (Logframe Analysis)
• Results Based Management: Results Chains
• Indicators
• Risks, Assumptions and Mitigation
• Linking M&E to Project Steering (The M&E Feedback Loop)


Overview
Definition: Monitoring
▪ A continuous function that uses systematic collection of data on specified indicators to provide management and the main stakeholders of an ongoing intervention with indications of the extent of progress and achievement of objectives and progress in the use of allocated funds.


Overview
Definition: Evaluation
▪ The systematic and objective assessment of an on-going or completed project, program or policy, its design, implementation and results.
▪ The aim is to determine the relevance and fulfillment of objectives, efficiency, effectiveness, impact and sustainability.
▪ An evaluation should provide information that is credible and useful, enabling the incorporation of lessons learned into the decision-making process.


Overview
Monitoring vs. Evaluation
■ Monitoring aims to identify problems which occur periodically during the implementation of an intervention in order to take proper corrective measures if necessary. It helps one to ascertain whether the on-going activities will give rise to the expected outputs and outcomes (i.e. utilisation of outputs).
■ Evaluation focuses on the realisation of an intervention's objectives and impacts (expected or unexpected). It implies an in-depth analysis of all key and pertinent factors related to the implementation of the intervention and the linkages between these factors.
■ In a given M&E system, these two functions should be performed in a complementary manner.


Baseline Assessment
A baseline assessment is a crucial element in project planning, research and any monitoring and evaluation (M&E) framework.
• A baseline assessment provides information on the situation an intervention aims to change (the underlying theory of change).

• It provides a critical reference point for assessing and benchmarking changes and long term impact, as it establishes a basis for comparing the situation before and after an intervention, and for making inferences as to the effectiveness of the intervention. • Baseline assessments should be conducted before the actual intervention starts so as to serve as a benchmark for examining what change is triggered by the intervention.


Baseline Assessment Example Excerpt
Indicator: Increased availability of relevant skills and tools for at least 15 CSOs/NGOs to design and implement advocacy campaigns relevant to their mandate.
Baseline (2017): Weak or limited ability to design and implement effective awareness and outreach campaigns that engage key stakeholders in dialogue towards change.
Current value (as at 2017): Low
Target (2021, at the end of 3 years): CSOs across 5 thematic areas have been strengthened in their skills and overall capacity to engage with and influence key stakeholders through effective advocacy and communication techniques learned.


Monitoring & Evaluation Framework Overview
Start with a Plan: An M&E plan outlines, in a structured manner, all the necessary processes, steps and methods in the system employed, from data collection to reporting. This includes the monitoring tools to be utilised, the key staff members who will be employing these instruments, timelines and reporting channels to management, and how findings are translated into actions for reform, where necessary, by the management team (in budget decisions, strategy, accountability, learning processes, etc.).


Monitoring & Evaluation Framework Overview
M&E Plan Example Excerpt
Objective: Increased CARIFORUM exports into the EU market
Indicator details: % of CARIFORUM beneficiaries who confirm improved export capacity through increased ability to meet international standards (baseline: 0% in 2012, target: 75% in 2017)
Intervention activities undertaken by Project: ISO Certification Training
Source of Verification (SOV): 20 beneficiary manufacturing companies
Proposed data collection method(s): Survey with open and closed ended questions
Responsible persons/agencies: Hired expert and Project staff
Period of monitoring: Aug-Sep 2016


Monitoring & Evaluation Framework Overview
We then begin to build our M&E Framework. Standard documents to be prepared:
▪ A clear outline of the system and its information requirements, describing the Project's guiding conceptual and methodological framework for M&E activities and its tools;
▪ Clearly defined and measurable Project Indicators, which include quantitative or qualitative units of measurement to gauge success, along with set timelines for achievement;
▪ A detailed Stakeholder Assessment: a listing and map of assigned and foreseen roles;
▪ A stakeholder contact database, with specific coding according to type of stakeholder, e.g. public sector, private sector, civil society, etc.;


Monitoring & Evaluation Framework Overview
▪ An overarching results matrix or model, graphical or tabular in format, depicting the entire Project's goals and the expected outcomes of project activities;
▪ A Project Logical Framework Matrix, showing expected results and how they will be measured (e.g. identification of Objective Verifiable Indicators (OVIs), Sources of Verification (SOV), important assumptions and expected risks; mitigation measures may also be included); and
▪ A Project Indicator Review, which quantitatively benchmarks the contribution made to each Project indicator on an annual basis, e.g. changes in % values.


Monitoring & Evaluation Framework Overview
Logic Framework Example Excerpt
Intervention logic (results chain): Indirect Impact(s), Direct Impact(s), Outcome(s), Output(s), Main Activities
Objective Verifiable Indicator (OVI): a measurable and time bound change
Source of Verification (SOV): who/what will verify this change
Important Assumptions: conditions needed for success
Resources/costs and Timeframe: recorded against the Main Activities
Potential Risks: obstacles to implementation
Mitigation Measure(s): actions to minimise risks
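For teams that keep the logframe in a spreadsheet or a small script rather than a text document, the same row structure can be captured digitally. The sketch below is a minimal illustration only: the class, field names and example values are assumptions made for this example, not part of any prescribed template.

```python
# Minimal sketch of one row of a logical framework matrix.
# Field names mirror the excerpt above; the example values are illustrative only.
from dataclasses import dataclass, field

@dataclass
class LogframeRow:
    result_level: str            # e.g. "Output", "Outcome", "Direct impact"
    intervention_logic: str      # the expected result at this level
    ovi: str                     # Objective Verifiable Indicator (measurable, time bound)
    sov: str                     # Source of Verification (who/what verifies the change)
    assumptions: str             # conditions needed for success
    risks: list = field(default_factory=list)        # potential risks/obstacles
    mitigation: list = field(default_factory=list)   # measures to minimise risks

example = LogframeRow(
    result_level="Outcome",
    intervention_logic="CSOs apply performance monitoring to their main activities",
    ovi="% of trainees applying a self-assessment checklist (baseline: 0 in 2017, target: 75% in 2019)",
    sov="Sample surveys and project documentation",
    assumptions="Trained staff remain in post during the monitoring period",
    risks=["High staff turnover in partner CSOs"],
    mitigation=["Train at least two officers per organisation"],
)
print(example.ovi)
```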


Results Based Management: Results Chains
Results-based management is a management strategy by which all actors on the ground contribute, directly or indirectly, to achieving a set of development results. These results are broken down into outputs, outcomes and impacts (direct and indirect) via what is called a 'results chain'. Results in turn contribute to the institution's overarching programme goals (United Nations Development Group, 2010).


Results Based Management: Results Chains
The results chain (figure):
Funds (financial resources provided by Parliament, RECs, others) → Inputs (staff, equipment, material, etc.) → Outputs (results: products and services) → Outcomes → Direct impacts (as defined in government policies) → Indirect impacts (on society, the economy, or other aspects)

Results Based Management: Results Chains GIZ Results Chains (generic)


Results Based Management: Results Chains
Critical aspects to note:
▪ There is rarely a single cause-effect relationship between one level of causality and the next;
▪ The longer the term, or the higher the level, the less likely that cause-effect relationships can be easily established;
▪ Secondary and possibly undesired effects should be taken into account;
▪ Cause-effect linkages are not completely under the control of the implementing entity, especially if a wide variety of complex factors are involved;
▪ The degree of control can be absolute, strong, medium or weak.


Indicators ▪ Indicators are variables, the values of which should be objectively verifiable, allowing us to measure the degree to which an element of the intervention logic (objectives etc.) has been achieved.

▪ Indicators reflect a common understanding of the project planners and their partners with regard to how they want to measure and assess a project’s progress.


Indicators
Specific Objective:
▪ Alumni of M&E training utilize knowledge gained to introduce performance monitoring measures for all main activities executed.
Quantitative Indicators:
▪ At least 60% of training participants are able to provide a completed logical framework reflecting 100% of their activities, approximately six (6) months after course completion (Source: project documentation).
Qualitative Indicators:
▪ At least 70% of M&E training participants confirm that they have increased skills and knowledge on how to introduce and perform M&E practices in their own organization (Source: sample surveys, interviews).
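To show how such a threshold indicator might be checked against collected follow-up data, here is a minimal sketch. The participant records and the field name are invented purely for illustration.

```python
# Minimal sketch: check the quantitative indicator above against (invented) follow-up data.
# Each record notes whether a participant provided a completed logical framework
# roughly six months after course completion.
participants = [
    {"name": "P1", "logframe_submitted": True},
    {"name": "P2", "logframe_submitted": True},
    {"name": "P3", "logframe_submitted": False},
    {"name": "P4", "logframe_submitted": True},
    {"name": "P5", "logframe_submitted": False},
]

threshold = 0.60  # "at least 60% of training participants"
share = sum(p["logframe_submitted"] for p in participants) / len(participants)

print(f"Share providing a completed logframe: {share:.0%}")
print("Quantitative indicator met" if share >= threshold else "Quantitative indicator not yet met")
```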


Indicators
Good Indicators are SMART:
■ Specific
■ Measurable
■ Achievable
■ Relevant
■ Time constrained
=> Good Indicators are "smart ones"!


Indicators
A practical example:
% of trainees in the CSOs of CARIFORUM states applying a standard checklist to self-assess their level of performance monitoring functionality (baseline: 0 in 2017, target: 75% in 2019).
A good indicator answers: What? Who? Where? How? How many? When?
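When reporting against an indicator like this, one common presentation is progress between the baseline and the target. The figures in the sketch below follow the example above, except for the current value, which is invented for illustration.

```python
# Minimal sketch: express an indicator's current value as progress towards its target.
# Baseline and target follow the example above; the current value (45%) is invented.
baseline = 0.0   # % of trainees in 2017
target = 75.0    # % of trainees by 2019
current = 45.0   # % of trainees at the latest review (illustrative)

progress = (current - baseline) / (target - baseline)
print(f"Current value: {current:.0f}% of trainees")
print(f"Progress towards the 2019 target: {progress:.0%}")
```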


Risks, Assumptions and Mitigation
▪ No one can guarantee the success of a project. There are always risks that might endanger implementation of a project as planned.
▪ Risks differ according to:
• Their probability,
• Whether they are direct in nature (delays, higher costs, lower quality, etc.), or
• Indirect in nature (loss of jobs, embarrassment, social unrest, etc.)

▪ The sooner a risk factor is recognized and assessed, the easier it is to limit or mitigate the eventual damage it causes.


Risks, Assumptions and Mitigation
Some typical sources of project risk:
▪ Availability of adequately qualified personnel (for project management and other tasks),
▪ Timely delivery of material and services,
▪ Partners' willingness to cooperate,
▪ Transparency in the start situation (baseline) and in the procedures,
▪ Interventions by third parties.


Risks, Assumptions and Mitigation
Assumptions:
▪ A brief assessment of those internal and external conditions that are assumed to be in place and that are necessary for the successful execution of a project.
▪ Some assumptions are critical, while others are marginal.
▪ An assumption describes the ideal situation, e.g. "economic and political stability", "secured access to necessary financial support".


Linking M&E to Project Steering
Conceptualising an intervention:
▪ Know how to justify an intervention activity to funders and ensure linkage to your own organisational goals;
▪ Clearly identify Main and Specific Objectives: always link back to your mandate and that of your funding partners;
▪ Clearly highlight the foreseen results (i.e. outputs, outcomes and impacts) for the target beneficiary group(s); and
▪ Be clear in your proposed methodology, that is, how exactly you plan to achieve the results you claim (is this based on international best practice or on target group feedback?).


Linking M&E to Project Steering
Concept Note (template)
Basic structure (not more than 2 pages):
1. Rationale and Justification (the main problem which your project seeks to address; include findings of needs assessments and the status of the project sector),
2. Project Description (including objectives and key activities),
3. Target Audience (primary and secondary),
4. Expected Outcomes (immediate and long term),
5. Risks and Assumptions.


Monitoring & Evaluation (M&E) in Practice
Presented by: Ginelle Greene-Dewasmes, Monitoring & Evaluation Specialist


Module Areas

• Overview
• Performance Monitoring Stage 1- Planning and Design

• Performance Monitoring Stage 2- Data Collection, Compilation and Pre-analysis

• Performance Monitoring Stage 3- Data Analysis and Reporting

• Performance Monitoring Stage 4- Long Term Evaluation and Impact Assessment


Overview
The activities of an active M&E system can be broken down into 4 main stages:
1. Data Gathering- baseline assessment, design of tools (ensuring both qualitative and quantitative capture), primary and secondary data sources.
2. Data Analysis- actual values provided at agreed intervals in charts, graphs, etc.
3. Data Reporting- assessment of the achievement of objectives and the indicators measured.
4. Project Evaluation- long term impact assessment, usually done by an external party.
While the last stage is usually done by an external party, the first three are conducted by internal staff on an ongoing basis and feed critically into the fourth stage of Project Evaluation.


Overview
Example Monitoring, Data Gathering Toolkit (GIZ example)
Planning Stage: Concept note; Project & Mission Brief; Standardized budget and participant registers
Immediately Post Activity: Post event evaluation form(s); Back to Office Report (BTOR)/ Summary Results Chain
Long Term Follow-up (6 months to 1 year): Impact Assessment form; Interview/ Focus Group Questions Sheet


Performance Monitoring Stage 1- Planning and Design
Assign a responsible person to manage M&E. This person can employ free, easy-to-use online cloud tools for data gathering, analysis and reporting, such as Google Forms and SurveyMonkey.

Monitoring Stage 1 (Planning steps leading up to execution of the intervention activity)
STEP 1.1:

Upon approval of an activity by the Manager, an 'Intervention/Activity Brief' should be prepared by the responsible technical officer. This outlines the key Project objective and indicators addressed by the intervention activity and the expected results chain.

STEP 1.2:

Design of data collection instruments: surveys, interviews, focus groups, observation, etc. These must be based on the specific objective and indicators the activity is trying to achieve.

STEP 1.3:

Dissemination of a standardised online registration form via email with a 'Google Docs' link, or in person as hard copy (baseline). Responses are seen automatically by the administrator (if done via hard copy, the M&E administrator would need to key in the results manually after collecting the hard-copy versions printed from Google Drive);

STEP 1.4:

Post Event Data Collection Forms are prepared and shared at the end of each intervention (in person for maximum response). Mode 1- send an email invitation to complete the online evaluation form via a link; or Mode 2- print evaluation forms from Google Drive and disseminate hard copies at the end of the workshop/intervention activity, to be filled in by (i) beneficiaries and (ii) technical specialist(s)/trainers (feedback should include questions that directly measure the activity's contribution to the relevant Project Indicators and goals).


Performance Monitoring Stage 2- Data Collection, Compilation and Pre-analysis
Monitoring Stage 2 (Data Collection and Compilation)
STEP 2.1
Data management- Ensure that end-of-activity/event/periodic evaluation forms are filled in and collected in a timely manner. If responses are not entered online by participants directly, the responsible technical officer needs to manually key the feedback into Google Drive, which will then generate the graphs automatically (this should be done using separate forms for different types of stakeholders as necessary);
STEP 2.2
Link Feedback to Indicators and Objectives- The quantitative information in the data collected is compiled so as to highlight its contribution to the key indicators in results-matrix format (outputs, outcomes, expected direct and indirect impacts), as shown in the sketch below.
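If the evaluation responses are exported as a CSV file (for example from an online form), linking feedback to an indicator can be as simple as counting responses per stakeholder type. The sketch below uses only the Python standard library; the file name and column names are assumptions made for this example.

```python
# Minimal sketch: compile exported post-event feedback and link it to an indicator.
# Assumes a CSV export named "post_event_feedback.csv" with the columns
# "stakeholder_type" and "increased_me_skills" (Yes/No); both names are illustrative.
import csv
from collections import Counter

totals, positives = Counter(), Counter()

with open("post_event_feedback.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        group = row["stakeholder_type"]  # e.g. public sector, private sector, civil society
        totals[group] += 1
        if row["increased_me_skills"].strip().lower() == "yes":
            positives[group] += 1

for group, n in totals.items():
    share = positives[group] / n
    print(f"{group}: {share:.0%} report increased M&E skills ({positives[group]}/{n})")
```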


Performance Monitoring Stage 3- Data Analysis and Reporting
Monitoring Stage 3 (Analysis & Reporting of Activities' Immediate Outcomes)
STEP 3.1:
Periodically, the compiled data is analysed (e.g. graphs/charts generated in Google Drive, beneficiary statements; a small charting sketch follows below) and formal reports are prepared:
• Executive Summary;
• Major Findings- relevant feedback linked to Project Indicators;
• Successes- use respondents' comments to give examples;
• Challenges experienced;
• Recommendations and Conclusions- based upon suggestions of respondents and the experience of the technical team.
OR
• A simple Overarching Results Chain can be prepared showing major achievements towards Indicators in tabular format.
Then share the completed report with the main stakeholders, i.e. Management, key partner agencies, Ministries, donors, etc. Make the necessary changes to your programme based on the recommendations made.
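Where charts are not generated automatically, a few lines of scripting can produce them for the periodic report. The sketch below assumes the matplotlib library is available; the indicator names and values are illustrative.

```python
# Minimal sketch: plot current values against targets for two (illustrative) indicators.
import matplotlib.pyplot as plt

indicators = {
    "Trainees applying checklist (%)": (45, 75),             # (current value, target)
    "Participants reporting increased skills (%)": (70, 70),
}

labels = list(indicators)
current = [v[0] for v in indicators.values()]
targets = [v[1] for v in indicators.values()]
x = range(len(labels))

plt.bar([i - 0.2 for i in x], current, width=0.4, label="Current value")
plt.bar([i + 0.2 for i in x], targets, width=0.4, label="Target")
plt.xticks(list(x), labels, rotation=15, ha="right")
plt.ylabel("Percent")
plt.title("Progress towards Project indicators")
plt.legend()
plt.tight_layout()
plt.savefig("indicator_progress.png")  # attach the chart to the periodic report
```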


Performance Monitoring Stage 4- Long Term Evaluation and Impact Assessment
Stage 4 (Interim to Long Run Evaluation and Impact Assessment)
STEP 4.1:
This stage is important for measuring tangible impacts and justifying money spent or reasons for Project extension.
Internal- surveys can be disseminated 6 months to a year after an activity to gather feedback and determine direct impacts, if any, or the challenges hindering them (a minimal before/after comparison sketch follows below);
▪ Beneficiaries should be made aware from the start of the Project's ongoing monitoring for impacts and its importance to ensuring their success (their consent to have their feedback published anonymously should be included in all surveys conducted);
▪ Impact Assessment surveys can be done via an online form, however this may lead to a low response rate. Telephone interviews and face-to-face meetings, where feedback is entered manually into the Google form by a Project technical expert, may yield greater results, although they are more time consuming.
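Where the same self-assessment question is asked at baseline and again at the six-to-twelve-month follow-up, the simplest internal impact check is the change between the two. The sketch below compares paired scores; all organisation names and figures are invented for illustration.

```python
# Minimal sketch: compare baseline and follow-up self-assessment scores (1-5 scale)
# for the same beneficiaries. All names and figures are invented.
from statistics import mean

baseline_scores = {"CSO-A": 2, "CSO-B": 1, "CSO-C": 3, "CSO-D": 2}
followup_scores = {"CSO-A": 4, "CSO-B": 3, "CSO-C": 3, "CSO-D": 4}

changes = {org: followup_scores[org] - baseline_scores[org] for org in baseline_scores}

print(f"Average baseline score:  {mean(baseline_scores.values()):.1f}")
print(f"Average follow-up score: {mean(followup_scores.values()):.1f}")
print(f"Average change:          {mean(changes.values()):+.1f}")
improved = sum(1 for change in changes.values() if change > 0)
print(f"{improved}/{len(changes)} organisations report an improved score")
```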


Performance Monitoring Stage 4- Long Term Evaluation and Impact Assessment
Stage 4 (Interim to Long Run Evaluation and Impact Assessment)
STEP 4.1 cont'd:
▪ Less frequently, other data collection tools can also be used, including:
• Inviting past beneficiaries to participate in focus group discussions; and
• Conducting face-to-face interviews.

External- the Project or the relevant donor can fund an external team to conduct an impact evaluation to capture long-term results. This is usually done nearing the end of a Project.


Useful Information Sources
Useful videos about using the Google Drive system for Monitoring & Evaluation:
Google Forms: An Introduction - https://www.youtube.com/watch?v=5NfeJ56gvPQ
Google Drive: An Introduction - https://www.youtube.com/watch?v=HU9Z5gtQVk
Google Drive: Sharing and Collaborating - https://www.youtube.com/watch?v=Dsoa9skxVuk


Thank you for your attention
Mrs. Ginelle Greene-Dewasmes, Monitoring & Evaluation Specialist
Email: Ginelle.greenecmccaribbean@gmail.com
Skype ID: ginelle.greene (region: Trinidad & Tobago)

