AWEPA Handbook on Programme Design, Monitoring and Evaluation

www.awepa.org

A handbook containing practical advice and tools to guide the design of AWEPA programmes and the monitoring and evaluation of results

AWEPA works in cooperation with African Parliaments to strengthen parliamentary democracy in Africa, to keep Africa high on the political agenda in Europe and to facilitate African-European Parliamentary dialogue.


Production Notes

Text: Nicolaas van der Wilk, AWEPA M&E Officer
Photos: AWEPA Staff, UN Photo Archive
Design: Anastasia-Areti Gavrili and Emanuela Falzon Campbell

AWEPA International, Prins Hendrikkade 48-G, 1012 AC Amsterdam, The Netherlands
Tel: +31 20 5245678, Fax: +31 20 6220130
amsterdam@awepa.org, www.awepa.org

© AWEPA 2016. The AWEPA Handbook on Programme Design, Monitoring and Evaluation was made possible by AWEPA's institutional donors.

Cover Photo: Agents of change unite against Female Genital Mutilation/Cutting in Kenya. Copyright/AWEPA.


Contents

About this handbook
Programme design and M&E in AWEPA
AWEPA principles for programme design and M&E framework

Programme design
  Step 1: Carrying out a political context analysis
  Step 2: Doing a needs-assessment
  Step 3: Formulating objectives and developing the intervention logic
  Step 4: Developing a Theory of Change
  Step 5: Selecting progress indicators
  Step 6: Budgeting

Monitoring and Evaluation
  Step 1: Developing a Monitoring and Evaluation (M&E) plan
  Step 2: Carrying out a Baseline study
  Step 3: Monitoring activity results on output and intermediate outcome level
  Step 4: Monitoring intermediate outcomes and outcomes
  Step 5: Monitoring programme impact on the society
  Step 6: Assessing results using complexity-aware methods

Programme Evaluation
  Step 1: Establishing a reference group
  Step 2: Determining the evaluation purpose
  Step 3: Designing the evaluation
  Step 4: Collecting the data
  Step 5: Participatory reviewing of findings
  Step 6: Using the evaluation results

Annex 1: Further reading
Annex 2: List of M&E Tools
Annex 3: Guide to Key Informant Interviews
Annex 4: Guide to Focus Group Discussions
Annex 5: Guide to Surveys
Annex 6: Guide to Desk Research

Elections lie at the heart of democracy. AWEPA works in partnership with African parliaments to strengthen democracy in Africa, keep Africa high on the political agenda in Europe, and facilitate African-European parliamentary dialogue. Photo / UN Archive


About this handbook

This handbook offers AWEPA staff hands-on guidance for designing new programmes and for measuring and making visible programme results. It summarises best practices and introduces several practical tools. The handbook is based on AWEPA's methodological approach to programme design and monitoring and evaluation (M&E), which was updated in 2015. This approach is grounded in thirty years of experience of working with parliaments and parliamentarians in Africa, complemented with new insights and innovations propagated in recent studies on the effectiveness of parliamentary support. The renewed organisational focus on methodology is made possible by AWEPA's institutional donors.

The document is structured as follows:

• Firstly, programme design practices and M&E are situated in AWEPA's general programme cycle;
• Secondly, insight is given into how AWEPA has adjusted its methodological approach and M&E framework to the highly political contexts in which it operates;
• Thirdly, guidelines, tools and best practices are presented for programme design;
• Lastly, the M&E framework is explained, and tools and best practices are introduced for effective monitoring of results and programme evaluation.

All the tools introduced in this handbook can also be found in the AWEPA SharePoint environment. It should be emphasised that programme design and M&E practices are continuously changing, following new trends and lessons learned from the field. This handbook will therefore be updated periodically.


AWEPA Handbook - 2016

AWEPA in action during the “Peace, Security and Sustainable Development” International Parliamentary Seminar in Brussels, 2015. Photo / AWEPA


Programme design and M&E in AWEPA

Sound programme design and the monitoring and evaluation (M&E) of results ensure that AWEPA programmes remain innovative and of high quality. Programme design involves analysing the challenges to be addressed, formulating objectives, identifying activities and developing a programme theory. M&E involves the systematic monitoring and collection of data on the results being realised, and the assessment of the relevance and effectiveness of the programme and the overall impact achieved. Done well, M&E helps you demonstrate to donors and partners that you are doing the right things and doing them well. The lessons learned contribute to designing programmes that are more effective, innovative and of higher quality.

M&E practices and the design of new programmes are inherently linked to AWEPA's programme cycle (Figure 1). Firstly, the overall AWEPA Strategy informs programme design and fundraising. Secondly, the intervention logic drafted during programme design determines activity implementation and the monitoring of results. Lessons learned from programme evaluations feed back into the overall AWEPA Strategy.

The overall AWEPA strategy is laid out in a number of strategic documents:

• Statutes: AWEPA is registered as an association; its overall purpose, geographic scope and membership are enshrined in its statutes.
• Strategic Vision: this multi-annual document elaborates on AWEPA's broader long-term objectives and priorities. It sets out targets regarding membership, partnerships, funding and operations, and contains the thematic and programmatic priorities on which AWEPA intends to focus.
• Theory of Change: the ToC explains the changes to which AWEPA wishes to contribute, the strategies and interventions put in place to contribute to these changes, and the assumptions AWEPA makes about why these particular strategies lead to the impact sought. It further explains AWEPA's vision and mission, as well as the specific added value of its strategies and interventions, and primarily defines the kind of work AWEPA does and does not do.
• Methodology document: this document presents AWEPA's organisation-wide guidelines, tools and best practices for designing, monitoring and evaluating its programmes. It helps programme staff uphold the distinctive AWEPA approach in the different stages of the programme cycle and sets out key principles for measuring and making results visible.
Figure 1: The AWEPA programme cycle



AWEPA principles for programme design and M&E framework

In its work to strengthen parliamentary democracy in Africa, AWEPA operates in highly political contexts that are actor-based, interest-driven and dynamic. Political systems are shaped by the actions and interactions of individual actors (MPs, committees, political parties), who pursue different moral, economic or institutional interests according to particular incentive structures. Political and policy processes are subject to changing internal composition as well as changing external influences; as such, these processes are non-linear and unpredictable. As a result of these contextual particularities, the design and M&E of AWEPA's parliamentary support programmes are:

• Adapted to different contexts and based on political analysis: programmes and M&E strategies are tailored for each partner parliament, adjusted to the particular country context and the needs of the partner. An analysis of the political-economic context in which the partner operates ensures that the context is reflected in the programme design.
• Subject to a participatory exercise with key stakeholders: the parliamentary partners should participate in the design of the programme and of the M&E strategy, which increases their ownership of the programme. Political Coordinators and Senior Parliamentary Advisors (SPAs) can play an important role in facilitating these discussions.
• Aimed at monitoring and capturing the longer-term results of AWEPA activities, and focused on contribution rather than attribution: it is not enough to pursue and monitor activity outputs, e.g. the number of trainings and participants. Instead, it is important to collect anecdotes and stories about the change AWEPA is contributing to: changes in the capacities and functioning of MPs, changes in legislation and institutions, and changes in relations between key stakeholders in society. The focus is on registering contribution to these changes, not attribution, because the implementation of activities cannot easily be linked to increased parliamentary performance.
• Aimed towards adjustable objectives and indicators: given the changing and unpredictable nature of political processes and the dynamics of the electoral cycle, the regular adjustment of programme objectives, intervention strategies and indicators should be the rule rather than the exception.

AWEPA's M&E framework takes these principles into account and comprises four result levels: outputs (direct outputs of activities), intermediate outcomes (direct effects of the activities and intermediary steps towards the outcomes), outcomes (specific objectives of the programme) and impact (overall objective of the programme and impact on society).1 The conceptual model and assumptions underpinning the measurement of results at these levels are explained in chapter 4. The result levels are captured in a logical framework (logframe), a tool used for project management, M&E and reporting. The logframe approach is, however, complemented with a “complexity-aware approach”, which includes methods such as a Theory of Change for programme design, the monitoring of results through Stakeholder Feedback, and the evaluation of outcomes through the Most Significant Change technique. This dual approach allows for adaptive management of the programme, the measurement of non-linear result pathways, and making visible outcomes beyond those originally planned for.

In the remainder of this document, the approach to programme design and the M&E framework are set out further through the introduction of practical tools and best practices. Detailed manuals and templates for each of the tools (underlined) are available on the AWEPA SharePoint.

1. The terminology varies per donor, and the level of intermediate outcomes is often left out. EU: Outputs / Specific objectives / Overall objective. DFID: Result Area / Outcomes / Impact. USAID: Outputs / Sub-Purposes / Purposes / Goal. Sida: Results / Project Goals / Overall Goal.



Illustration by Silva Ferretti in ‘Monitoring of Advocacy – When the change process is complex’, a 2014 paper by Fagligt Fokus.

Programme design The design of AWEPA programmes is a participatory process in which the AWEPA staff, together with the parliamentary partner and the Political Coordinator, define programme objectives and interventions. These deliberations may be informed by AWEPA’s Strategic Vision and Theory of Change. Through a political context analysis, stakeholder analysis and needs-assessment, the interventions can be tailored to the needs of the partner and the contextual particularities, and realistic objectives and indicators for success can be defined.

Step 1: Carrying out a political context analysis

Due to complex and often volatile parliamentary dynamics, it is difficult to determine in advance whether support strategies will be effective and whether objectives are realistic. Therefore, each AWEPA programme is to be informed by an analysis of the political-economic context in which the parliamentary partner operates. Such a context analysis is carried out jointly by AWEPA staff, the Political Coordinator and SPA, and the parliamentary partner, and should reveal the formal and informal institutions and rules; the stakeholders and the power relations between them; and the incentives and interests in and around parliament. Figure 2 illustrates some of the key political-economy factors that should be taken into account.

A political context analysis tool is available on SharePoint. The tool proposes analytical categories to be covered and points to existing sources that can help develop assessment questions. It also proposes a stakeholder-analysis framework and includes a template for a context analysis report, as well as guidelines on how to prepare and execute the study.


Figure 2: Political economy factors for parliamentary strengthening

Foundational factors (historical conditions and considerations)
• History of state formation
• Sectarian, ethnic, geographic and religious structures and their impact on politics
• Political history underpinning the current regime, and the penetration and acceptance of its authority

Rules of the game (formal and informal institutions underpinning political and parliamentary dynamics)
• The nature of patronage and political appointments
• Resources and economic linkages of key personalities in and around parliaments
• Existing accountability mechanisms (including constituency relations)
• Electoral politics and the balance between short- and long-term perspectives in parliamentary affairs

Here and now (current political and economic setup)
• Nature of the political system, balance of power between the executive and legislative branches
• Nature, quality and power of political parties: models, linkages, groups
• Current and recurrent political issues and discussions with conflicting interests

Step 2: Doing a needs-assessment

In addition to a context analysis, programme design typically starts with an analysis of the needs of the partner parliament. This is done together with the parliament's leadership to ensure that the assessment is fully owned, in line with the parliament's own strategic plan and coherent with support provided by other partners. The Political Coordinator and SPA can play an important role in initiating and facilitating such talks. A participatory needs-assessment is used to reveal requirements for capacity strengthening and for technical and political accompaniment, as well as to identify topics for parliamentary dialogue with European peers. The brainstorming focuses on the parliament's long-term strategic objectives to which AWEPA can contribute (rather than short-term needs). The assessment takes into account the three interrelated dimensions identified in AWEPA's Theory of Change:

1. Capacities and functioning of MPs and the parliament as a whole;
2. The parliament's internal institutionalisation;
3. Relations between MPs and other stakeholders in society.

A detailed categorisation is presented in Figure 3. The AWEPA Parliament Scan translates these categories into concrete questions that can be debated together with the parliamentary stakeholders and other experts. The Parliament Scan can be found on SharePoint.

Although the context analysis and needs-assessment are often combined exercises, it is important to acknowledge their differences. Whereas the needs-assessment primarily looks at internal factors and focuses on what can reasonably be changed within parliament, the context analysis looks at factors external to parliament that influence its functioning. As such, the former helps identify the types of support interventions that could be put in place, whilst the latter informs the user of assumptions, conditions and risks that might affect interventions leading to sustainable change.


Figure 3: Categories for needs-assessment

Traditional parliamentary functions
• Legislative process
• Budgetary cycle
• Executive oversight and accountability
• Representative function

Thematic roles
• Sectoral policy dialogue on development
• Conflict resolution and peace-building
• Regional integration and parliamentary diplomacy

Transversal questions in parliament
• Transparency and accessibility of parliament
• Accountability of MPs
• Interaction with other political actors and with civic actors
• Relations between caucuses and political parties
• Gender, marginalised groups, inclusiveness

Institutionalisation of parliament
• The committee system and leadership bodies
• Rules of Procedure, legislative workflow, Code of Conduct
• Protection and benefits of MPs
• Independence of the Administration
• HRM system
• Hansard, legal and communications departments
• Facilities, equipment and ICT



Step 3: Formulating objectives and developing the intervention logic

The context analysis and needs-assessment help build the intervention strategy and identify objectives at the outcome level that are realistic and fall within AWEPA's sphere of influence. The smaller incremental steps to be taken to reach the outcomes are referred to as 'intermediate outcomes' and also need to be identified beforehand. There should be a logical causal relation between the activities you plan to put in place, the intermediate results to be achieved and the outcomes to which you intend to contribute.

See Figure 4 for an example of a typical AWEPA intervention logic. Outcomes relate to the improved functioning of the parliament as a whole. Intermediate outcomes are the steps that lead to these outcomes: increased knowledge of MPs, establishment of new parliamentary procedures, increased interaction between MPs and stakeholders, etc. The overall objective concerns the lasting impact that a strengthened parliament will have on society. NB: there is no hard rule for differentiating between outputs, intermediate outcomes, outcomes and impact; it is up to you to arrive at a logical chain of results.

Choosing the right formulation for your objectives is important. Each objective should be:

• Specific: what are you trying to achieve, where and with whom?
• Measurable: will you know when you have achieved it?
• Achievable: can you do it with the money, the people and the time you have?
• Relevant: is it actually a solution to the problem you have identified?
• Time-bound: is it clear when it will take place?

The final result should be inserted into a logframe, a mandatory annex to many programme proposals. The logframe summarises the programme objectives, activities and indicators. There is no standard logframe template, but the EU template is both simple to use and adjusted to AWEPA’s type of work. The logframe template with explanations can be found on SharePoint. Part of the intervention logic is the identification of assumptions that underpin why the selected activities lead to the formulated results, as well as the identification of risks that may affect programme implementation and the achievement of results. In the AWEPA Risk Register, you can list all the possible risks and assign their likelihood and possible impact. The Register can be found on SharePoint and will help you systematically monitor and mitigate risks. Risks and assumptions are reviewed at least annually.



Figure 4: Example intervention logic

Overall objective: “Stronger democratic representation in...”

• Outcome 1: “More effective parliamentary legislative workflow...”
  - Intermediate outcome 1.1: “MPs have increased understanding of...”
    · Activity 1.1.1: “MP training on...”
    · Activity 1.1.2: “Study visit to...”
  - Intermediate outcome 1.2: “Parliamentary staff have improved capacity to...”
    · Activity 1.2.1: “Staff training on...”
• Outcome 2: “Improved parliamentary oversight on...”
  - Intermediate outcome 2.1: “Public consultation mechanism in place to...”
    · Activity 2.1.1: “Public outreach visit to...”
    · Activity 2.1.2: “Lobby for set-up of mechanism”

Photo opposite page: AWEPA in action during the “Peace, Security and Sustainable Development” Parliamentary Seminar in Brussels, 2015.


Step 4: Developing a Theory of Change

It is good practice to accompany the logframe with a narrative in which you explain how the outputs of the proposed activities lead to intermediate outcomes and how, in the longer term, these lead to the identified programme outcomes. A method that is particularly suited to the complex type of changes that AWEPA wants to bring about is the Theory of Change (ToC). The ToC is a narrative that focuses on non-linear and interdependent processes. To develop a ToC for your programme, include the following elements:

• A description of the (political) context and partner needs, and a problem analysis based on these.
• An analysis of the changes necessary to tackle the problem(s) identified, explaining how the changes are interdependent. Please refer to AWEPA's overall ToC, which identifies changes on three levels:
  - Formal institutions: laws, regulations and mechanisms that determine the mandate and powers conferred on the parliamentary institution, as well as its internal functioning and institutionalisation (Administration, Secretariat, committee system).
  - Actors: the capability of political and civic actors to perform their roles, which depends on their level of knowledge, skills, self-confidence, inspiration, aptitude and attitude.
  - Relations between actors: the existence of a culture (social norms) of democratic consultation among political actors and between political and civic actors.
• An introduction of the changes that can be brought about by AWEPA. It is important to make clear that AWEPA will only work in the areas where it has high added value (compared to other organisations). These should coincide with the outcome(s) in your intervention logic.
• Pathways of change: describe for each outcome which intermediate outcomes need to be achieved and which activities need to be put in place. Mention whether certain activities or outcomes also influence other outcomes, and describe the assumptions on which these are based as well as the risks that may affect success. Summarise each pathway using the terminology IF-THEN-BECAUSE:
  - IF MPs increase their understanding of... [see intermediate outcome 1.1]
  - AND IF staff have improved capacity to... [see intermediate outcome 1.2]
  - THEN the legislative workflow will be more effective... [see Outcome 1]
  - BECAUSE more private members' bills are produced when politicians understand the legislative workflow and are effectively supported by staff [assumption 1]
  - BECAUSE training for MPs and staff will make them more confident in scrutinising government bills [assumption 2]
• You may want to visualise the ToC by showing the relations between activities and the different (intermediate) outcomes using arrows and boxes.



Step 5: Selecting progress indicators

To monitor progress towards the outcomes and intermediate outcomes, it is important to predefine indicators that are evidence of, or exemplary for, the desired changes. Selecting indicators can, however, be challenging: political change is not linear, and impact often only becomes visible in the long term, which makes it difficult to expose a particular intervention's impact. Indicator selection is therefore guided by the following best practices:

• Involve the parliamentary partner in this process, as selecting indicators is an integral part of programme design.
• Focus on outcome and intermediate outcome indicators instead of indicators directly linked to activity outputs.
• Balance measurable quantitative indicators (e.g. number of pieces of legislation) with qualitative indicators (e.g. evidence of increased awareness).
• Limit the total number of indicators to what is necessary to objectively monitor impact. Fewer is often better than more.
• Consider data collection when setting indicators, not after. The sources of verification for some indicators may require a more proactive approach (interviews, focus groups), which needs to be planned in advance.
• Both the intermediate outcomes and the indicators can be adjusted throughout the implementation of the programme to reflect changing political realities.

Again, the formulation of the indicators is important. Each indicator should be:

• Direct: is it actually measuring what you are trying to measure?
• Clear: is it evident what kind of change is taking place, where and with whom?
• Quantitative: can you count or quantify it, or answer yes/no as to whether it has been achieved?
• Feasible: can you realistically conduct this measurement with the money, people and time at your disposal?

Hon. Bodil Valero, Head of the AWEPA Section in the European Parliament, addresses participants, June 2016.

Step 6: Budgeting

To ensure continuous attention to high-quality programme design, the monitoring of results and the proper evaluation of outcomes, it is important to allocate sufficient resources for M&E in the programme budget, which is typically set during the programme design phase. The M&E budget should cover travel for data collection, the hiring of external consultants, participatory progress review meetings and external evaluations.

AWEPA facilitates parliamentary outreach visits in Kenya, September 2016. Photo opposite page: AWEPA in action during the Subregional Conference on Female Genital Mutilation in Senegal, 2016. Photo/AWEPA



Monitoring and Evaluation

Although programme planning and activities are to a large extent defined in the design stage, the dynamics of political processes may call for adjustments to the intervention logic (i.e. adjusting expectations on outcomes or changing the type of interventions). This requires reliable information on programme progress. AWEPA measures progress on four levels:

• Output level: all the direct outputs and characteristics of the activities organised: the number of activities, date and location, the number and type of participants, topics covered, etc. Output data is best collected during and immediately after the activity.

• Intermediate Outcome level: the direct results of an activity organised by AWEPA and the small incremental changes and steps that occur between the activities and the final outcomes to which they should contribute. This could be, for example, ‘increased understanding of MPs’ or ‘improved contact between MPs and their constituencies’. The indicators typically range somewhere between the output and outcome levels. Data is collected on a regular basis, at least every six months, and during the preparation of narrative reports.

• Outcome level: the longer-term impact of programme activities on the functioning of parliament as a whole. This could be, for example, ‘improved oversight’ or ‘a more effective legislative workflow’. Results on this level can best be reviewed half-way through the programme and after its completion. The frequency and timing of data collection differ per programme, however, and often depend on reporting periods.

• Impact level: the contribution of the programme to broader societal goals such as economic development, human rights and democracy. Impact on this level is measured after the completion of the programme and often relies on third-party data on broader developments in society.

The measurement of progress at these four levels is based on the ‘logical framework’ approach, i.e. gathering data on indicators and outcomes that have been set in advance. Complementary ‘complexity-aware’ methods are used to also capture unintended and non-linear results. The parliamentary partner should be actively involved in the M&E exercises. This fosters ownership of programme progress, but it is also a sheer necessity, as parliamentary data can be difficult and time-consuming to access. Below, several key steps are introduced that will help you put in place a robust M&E system for your programme.

Step 1: Developing a Monitoring and Evaluation (M&E) plan

M&E starts with the development of a tailored M&E plan for your programme, listing all the logframe indicators and detailing for each indicator the exact definition, the data gathering method, and the planning and responsibilities for data collection. A template M&E plan is available on SharePoint and contains the following items:

• Indicator: This should help you to list all the indicators for the Impact, Outcome and Intermediate Outcome statements. You may want to assign numbers to keep track of them easily.

• Definition: Everyone should be able to understand exactly what your indicator is about and what it measures. Therefore, a detailed definition of each element of the indicator must be provided. Several sentences may be used if needed.

• When to collect and Frequency: When should the indicator be measured and how often? To answer this, think of your reporting deadlines but also, more practically, of the most suitable and economic moment to measure the indicator. Typically, indicators on Outcome and Impact levels only need to be measured at the start, at the half-way point and at the end of a programme, but sometimes more regular monitoring is needed to prevent information from getting lost.

• Source and data collection method: Where will you be able to find the data, and which data collection method is most appropriate? The most common methods include document analysis, internal activity reports, interviews, focus group discussions and questionnaires. These are further explained under Step 4. Again, consider the most practical and economic sources, taking into account the quality of the data. It may be necessary to combine several methods to improve data reliability.

• Preparations needed: Here, you can describe whether you need to put certain monitoring processes in place in advance. Take for example the monitoring of newspapers; it would be smart to make this a structural daily routine, to prevent you from having to go through hundreds of papers at the end of the year.

• Responsibility: Indicate who will be responsible for measuring the data.

Photo opposite page: AWEPA contributes to the organisation of a national workshop on Child Marriage Laws in Maputo, Mozambique. 2015.
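The plan items above map naturally onto a simple record structure, which can make it easier to keep indicator definitions consistent across a team. A minimal sketch in Python; the field values and the indicator itself are invented examples, not taken from an actual AWEPA plan:

```python
# A minimal sketch of one M&E plan entry, mirroring the template items
# described above. All field values are hypothetical examples.
from dataclasses import dataclass

@dataclass
class PlanEntry:
    number: str             # e.g. "IO-1", to keep track of indicators
    indicator: str
    definition: str         # detailed, so everyone reads it the same way
    when_and_frequency: str
    source_and_method: str
    preparations: str
    responsible: str

entry = PlanEntry(
    number="IO-1",
    indicator="Number of oversight questions tabled by trained MPs",
    definition="Written and oral questions recorded in the Hansard, "
               "tabled by MPs who attended at least one AWEPA training.",
    when_and_frequency="Every six months, before narrative reporting",
    source_and_method="Parliamentary Hansard; desk research",
    preparations="Arrange structural access to Hansard records",
    responsible="Programme officer",
)

def plan_summary(entries):
    """One line per indicator, for a quick overview of the plan."""
    return [f"{e.number}: {e.indicator} ({e.when_and_frequency})"
            for e in entries]
```

A flat structure like this can equally live in a spreadsheet; the point is that every indicator carries all six items from the template.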


Step 2: Carrying out a Baseline study

In practice, the baseline study is simply the first measurement you carry out on your indicators. The baseline study measures the indicators at the intermediate outcome, outcome and impact levels. This makes it a considerable investment, but it will enable you to demonstrate the progress made during the course of the programme, by comparing the ‘scores’ of the mid-term and final evaluations with the situation at the start of the project. Ideally, the baseline is executed before the first programme interventions take off.

The final baseline report will consist of the indicator values and a qualitative narrative that gives additional information on the status quo regarding each outcome statement and indicator. During the drafting of the M&E plan, you will be able to define which indicators should be part of the baseline measurement and which data sources and methods to use. The study will therefore consist of several separate research components. A template baseline report with guidelines can be found on SharePoint.
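Since later measurements are compared against the baseline ‘scores’, the comparison itself is straightforward. A minimal sketch, with invented indicator codes and values:

```python
# Sketch: comparing a later measurement against baseline values.
# Indicator codes and scores are invented for illustration.
baseline = {"IO-1": 12, "IO-2": 3.2}
midterm  = {"IO-1": 19, "IO-2": 3.9}

def progress(baseline, followup):
    """Absolute change per indicator since the baseline measurement."""
    return {k: round(followup[k] - baseline[k], 2)
            for k in baseline if k in followup}
```

The same comparison can be repeated for the final evaluation, giving change-since-baseline for every indicator measured in both rounds.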



Step 3: Monitoring activity results on output and intermediate outcome level

AWEPA has developed its own methodology to monitor the results of its activities. It is based on the assumption that AWEPA activities influence individuals, groups of individuals and eventually organisations. The influencing process consists of several correlating steps, depicted in Figure 5. AWEPA’s direct sphere of influence lies in exposing individuals to new information, knowledge and best practices, bringing together parliamentary peers and stakeholders, and facilitating skills training. The exposure facilitated by AWEPA is expected to contribute to increased knowledge and skills among the participants and increased awareness of selected topics. In the longer run, it is expected to contribute to changing attitudes and behaviour among the participants in their work, and ultimately to a change in general practices in parliament.

But do the activities facilitated by AWEPA indeed lead to increased knowledge, skills and awareness? To monitor this, a standardised activity reporting form has been created to record the direct outputs and longer-term results of activities. The form consists of four parts:

• Activity information, such as the date, objectives and topics (outputs).
• Scope and reach of the activity, e.g. the number and type of participants (outputs).
• Quality of the activity, such as changes in knowledge, skills, awareness and attitudes (intermediate outcomes).
• Follow-up undertaken after the activity, e.g. changes in behaviour and practices (intermediate outcomes).

The first three parts should be filled in immediately following the activity. The last part is completed several months after the activity took place and involves interviews with the activity participants to find out what follow-up has been undertaken. An activity evaluation form may help you gather additional insights through feedback from activity participants. A template activity evaluation form is available on SharePoint.

Monitoring knowledge and skills

It is not easy to assess whether participants did indeed improve their knowledge and skills on the topic of the training or conference. To find out, you can make use of the pre- and post-test template, available on SharePoint. In Part A, you list the training subjects, asking the participants what they know about a certain topic before and after the training. In Part B, you list some of the key competences and skills that are explained or trained during the activity, to assess whether the participants improved their skills during the activity. You might find that not all MPs are willing to ‘take a test’. Don’t insist if they refuse. Instead, you can carry out informal interviews with a selection of MPs, in which you ask them the same questions before and after the activity.

Monitoring awareness and attitudes

The Quality section of the activity reporting form analyses the extent to which the activity succeeded in raising awareness among participants on the subject and whether it contributed to changing their attitudes vis-à-vis certain matters. Increased awareness is measured by observing the participants during the activity. If the exposure to information and best practices has made them arrive at new insights regarding structural challenges in their own functioning or the functioning of parliament in general, it can be assumed that their awareness of these issues has increased. If participants propose solutions and follow-up to address these issues, it is assumed that they have also changed their attitude on these matters.

Monitoring follow-up

To find out whether the activity has led to any longer-term changes in the way MPs or staff execute their jobs, more research is needed. The annex to the Activity Reporting Form contains guidelines for interviewing activity participants several months after the activity took place. The objective is to ascertain whether these participants have changed their way of working, whether general practices in parliament have altered, and which of the improvements proposed by the participants during the activities have been followed through.
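The pre- and post-test comparison boils down to a per-participant difference. A minimal sketch with hypothetical participants and scores, assuming a 5-point self-assessment scale (the SharePoint template defines the actual questions and scales):

```python
# Sketch: average knowledge gain from pre- and post-test scores.
# Participants and scores are hypothetical, on an assumed 1-5 scale.
pre_scores  = {"MP A": 2, "MP B": 3, "MP C": 2}
post_scores = {"MP A": 4, "MP B": 4, "MP C": 3}

def average_gain(pre, post):
    """Mean score improvement across participants tested twice."""
    gains = [post[p] - pre[p] for p in pre if p in post]
    return sum(gains) / len(gains)
```

Reporting the average gain alongside the number of participants tested keeps the result honest when some MPs decline to take the test.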



Figure 5: AWEPA assumptions on steps that facilitate changes in individual and organisational behaviour

The figure depicts six correlating steps:

1. Being exposed: to information, to best practices, and to other stakeholders and peer pressure (outputs from AWEPA activities).
2. Gaining knowledge, skills and a network: through exposure to new information, best practices, and contacts with other stakeholders and peer pressure (direct intermediate outcomes from AWEPA activities).
3. Becoming more aware: through exposure, increased knowledge and new skills (direct intermediate outcomes from AWEPA activities).
4. Changing individual attitudes: through increased awareness, new knowledge, an enlarged network and peer pressure (longer-term intermediate outcomes from AWEPA activities).
5. Changing individual behaviour: because of changed attitudes, increased awareness, new knowledge and an expanded network (longer-term intermediate outcomes from AWEPA activities).
6. Changing collective practices: because of the changed behaviour of individuals and because individuals copy exemplary behaviour (final outcomes of AWEPA activities).


Step 4: Monitoring intermediate outcomes and outcomes

Whilst the Activity Reporting Form may assist in collecting information on the results of specific activities, you will also have a range of indicators that cannot be linked immediately to one specific activity. That is, the final outcome of an AWEPA programme often relates to the improved functioning of the partner parliament, in terms of its legislative, oversight and representative functions [2], and measuring results on this level requires additional research. Usually, these outcome indicators are only evaluated at the start, half-way through and at the end of a programme, but they may require regular monitoring as well, just like the intermediate outcome indicators. The data collection method to be used depends on the type of information you are looking for and the time and financial resources available. In general, five data collection methods are most frequently used:

• Internal activity reports to gather information on the quality and outputs of activities organised by AWEPA.
• Interviews with key informants to gather in-depth data on specific issues and collect different opinions. See Annex 3 for the Manual on Key Informant Interviews.
• Focus group discussions to brainstorm on a few topics, jointly reconstruct developments and listen to different opinions at the same time. See Annex 4 for a manual on setting up a Focus Group Discussion.
• Surveys to quickly gather data and opinions from a large group of respondents. See Annex 5 for a manual on designing Surveys.
• Desk research to analyse data from existing resources, such as press statements, the internet, the Parliamentary Hansard and academic publications. See Annex 6 for a manual on preparing your Desk Research.

The data collection manuals referred to can also be found on SharePoint. They highlight the pros and cons of each method and contain instructions and templates that can help with preparing the data collection. Among the key sources of information for your data collection are:

• Members of Parliament: can provide insightful information on developments within parliament, relations with other stakeholders and the impact that AWEPA activities have had.
• Parliamentary staff: can often provide valuable information about parliamentary practices, committee and plenary proceedings, the parliamentary agenda, private members’ bills, public consultations, etc.
• The Parliamentary Hansard: includes all documents relating to the parliamentary proceedings and can contain valuable information on the functioning of different committees and the legislative workflow. A bill-tracking system, if one exists, can be very useful for this purpose.
• NGOs: especially those carrying out complementary work with parliaments, in particular parliamentary monitoring organisations (PMOs), but also national research institutes and international organisations.

AWEPA’s President, Minister of State Ms. Miet Smet signs a new MoU with the Parliament of Kenya, May 2016.

2. Many other roles carried out by parliament can also be targeted, such as its role in conflict resolution and peace building, in promoting the place of women in society, etc.

Photo opposite page: United Nations Archive



Step 5: Monitoring programme impact on society

The contribution made by your programme to broader societal goals such as economic development, human rights and democracy is very difficult to ascertain. Assessing attribution at this level is not realistic. It often suffices to analyse and describe the broader trends in society and indicate what AWEPA did to contribute to those changes. Indicators on this level are measured after the completion of the programme, and assessment often relies on third-party data on broader developments in society. Useful databases include Afrobarometer, V-Dem, the Institutional Profiles Database, the World Justice Project Rule of Law Index, Freedom House, the Transparency Index and the Bertelsmann Index. These are all accessible online. When using these sources, please take into account that they do not always cover all countries and that they are rarely updated every year.

Step 6: Assessing results using complexity-aware methods

It is important to look beyond the set of indicators that you identified during the programme design phase and to assess other results and developments to which AWEPA has contributed. With the help of so-called ‘complexity-aware’ methods, unintended and non-linear results can be captured. These exercises often provide you with very interesting stories and quotes about how AWEPA has made a difference, which obviously makes good reporting and communications material. Several complexity-aware methods are being piloted, among others the Stakeholder Feedback and Most Significant Change methods. If successful, these will become an integral part of the monitoring and evaluation of the results of AWEPA programmes. By systematically recording the impact achieved in its programmes, AWEPA builds up a track record, which can be used for fundraising and for external communications. The track record is publicly shared through our monthly e-newsletter, thematic publications and Annual Reports.


Programme Evaluation

A full evaluation of your programme is the best way to assess the results and impact achieved. A Mid-Term Evaluation is often used to look specifically at the relevance of the Theory of Change and the intervention logic, whilst a Final Evaluation also assesses unintended results, the effectiveness and efficiency of interventions, programme impact and sustainability. Impact and Outcome indicators are assessed to evaluate progress made on the programme objectives. AWEPA has its own evaluation protocols, which stipulate how it manages evaluations and who ought to be part of the reference group that assesses the evaluation reports. The protocols reflect the high standards set by the OECD DAC evaluation criteria and the guidelines of the Netherlands IOB. It is good practice to have the evaluation executed by an external party, to ensure objectivity and the necessary expertise.

Step 1: Establishing a reference group

It is good practice to have the evaluation process guided by a reference group, which can consist of colleagues, representatives from donors and partner parliaments, and thematic or geographical experts. The reference group will jointly design the evaluation, select the evaluator(s) and assess the findings. Having such a group in place augments the objectivity of the evaluation process.

Step 2: Determining the evaluation purpose

An evaluation can have different objectives. The OECD DAC identifies five aspects of a development programme that should be evaluated:

• Relevance: to assess the extent to which the project interventions were consistent with the requirements and expectations of the beneficiaries and donors. The evaluation assesses the extent to which the programme objectives are relevant or appropriate to the needs of the beneficiaries.

• Efficiency: to evaluate how well the project resources (staff hours, money, time and goodwill) were used to achieve the goals of the project. In this light, the evaluation assesses the extent to which the interventions’ outputs were achieved with the lowest possible use of resources/inputs.

• Effectiveness: to evaluate the extent to which the direct results of the interventions (outputs) have contributed to achieving results on intermediate and outcome levels, or are expected to do so in the future.

• Impact: to evaluate whether the outputs and outcomes of the programme have contributed to broader, more far-reaching changes. Such changes can be intended or unintended and relate to the highest levels of results in the programme’s intervention logic.

• Sustainability: to evaluate the likelihood that the expected project outcomes will be maintained in the long term. Institutional and capacity development play an important role in determining sustainability.

Besides these criteria, you may have additional objectives for your evaluation, such as reviewing your intervention logic, promoting organisational learning, and internal and external accountability.



Step 3: Designing the evaluation

The Terms of Reference for an external evaluation should include, as a minimum, the evaluation objectives, the approach and methodology, the scope of the assignment and the deliverables. A template Terms of Reference is available on SharePoint. An external evaluation can be very expensive, costing between EUR 10,000 and EUR 50,000 depending on the scope of the exercise, and therefore needs to be planned for during programme design, in close coordination with the donor. It is good practice to request bidding evaluators to draft an inception report, which will give insight into the proposed approach and methodology. On the basis of their qualifications, experience, the inception report and the financial bid, the reference group should select the most appropriate evaluator or group of evaluators. The selection process should be recorded for accounting purposes.

Step 4: Collecting the data

The evaluator or team of evaluators will carry out the different elements of the evaluation with the help of the data collection methods introduced in the previous chapter. It is important that the methodology and approach used are well documented, so that they can be replicated in future exercises. The reference group is tasked with overseeing the quality and reliability of the data collected by the evaluator(s).

Step 5: Participatory reviewing of findings

It can be very beneficial to closely involve the parliamentary partner in reviewing the findings of the evaluation. They may be able to explain or elaborate on some of the findings. In addition, they are often very interested to hear about the results achieved through the joint programme. Such discussions may also be used to feed back into the planning stage: the defining of programme objectives, activities and indicators for the next year or cycle.

Step 6: Using the evaluation results

The findings of the evaluation can be used for several purposes.

• Programme logic: the evaluation gives insight into the relevance of your Theory of Change and assumptions, and the effectiveness of the intervention logic. Use the findings and the participatory review with the partners to adjust the programme logic.

• Internal learning: make sure the evaluation report is shared with all your colleagues, so that they can learn from the best practices identified in your programme and the challenges encountered. It might be worth organising an afternoon during which the findings may be further discussed. Don’t forget to upload the report to SharePoint, for future reference. The most important lessons will feed into AWEPA’s overall strategy.

• Communications: evaluations are rare exercises during which you can obtain an independent, objective assessment of the results and impact your interventions are achieving. Use this to demonstrate the value of your programme to donors and partners. Send them the report and, where possible, present the findings in person. Use the opportunity to elucidate how you intend to further improve the programme’s effectiveness and efficiency.

• Fundraising: external evaluations are strong reference material for attracting new funding. The findings may be used to demonstrate the efficiency of the organisation or the specific expertise of AWEPA in a certain domain.

AWEPA in action during the meetings for the AWEPA-NIMD Strategic Partnership, February 2016.




Annex 1: Further reading

Resources on parliamentary strengthening

• Rocha Menocal, A. & O’Neil, T. (2012). Mind the Gap: Lessons Learnt and Remaining Challenges in Parliamentary Development. UTV Working Paper 2012:1. Sida. Pp. 34-35.
• European Commission (2010). Engaging and Supporting Parliaments Worldwide: Strategies and methodologies for EC action in support to parliaments. Includes Parliamentary Pre-assessment framework.
• Power, G. (2011). The Politics of Parliamentary Strengthening: Understanding Political Incentives and Institutional Behaviour in Parliamentary Support Strategies. Westminster Foundation for Democracy.
• Mcloughlin, C. (2010). Topic Guide on Political Economy Analysis. Governance and Social Development Resource Centre (GSDRC).
• Harris, D. (2013). Applied political economy analysis: A problem-driven framework. ODI Politics & Governance programme, Methods and Resources. London: March 2013.

Resources on indicator selection for parliamentary strengthening

• UNDP (2010). Benchmarks and Self-Assessment Frameworks for Democratic Legislatures. A background publication prepared for the international conference on benchmarking and self-assessment for democratic legislatures.
• European Commission (2011). Strengthening democracy support to EU Delegations: from performance indicators, knowledge sharing to expert services.
• European Commission (2014). Mapping and study on performance indicators for EU support to Political Parties.
• Power, G. and Coleman, O. (2011). The Challenges of Political Programming: International Assistance to Parties and Parliaments.
• Williams, G. (2011). What makes a good governance indicator? Policy Practice Brief 6.
• GSDRC (2010). Evaluation of Governance Programme Indicators. Helpdesk Research Report.

Other assessment methodologies

• DFID (2005). Helping Parliaments and Legislative Assemblies to work for the Poor: A Guide to the Reform of Key Functions and Responsibilities.
• USAID (2000). Handbook on Legislative Strengthening - Legislative Strengthening Assessment Framework.
• IPU (2008). Evaluating Parliament: A Self-Assessment Framework for Parliaments.
• Burnell, P. et al. (2007). Evaluating Democracy Support: Methods and Experiences. Sida and International IDEA.

Community of practice

Parliamentary strengthening takes place within an open community of practice. Not only are tools and methods readily available online, there are also possibilities for engaging with the practitioners’ network through capacity4dev.eu or the online platform agora-parl.org, of which AWEPA is an affiliated organisation.



Annex 2: List of AWEPA M&E Tools with link to SharePoint

1. Guidelines for political context analysis and template
2. AWEPA Parliament Scan (needs-assessment tool)
3. Logical Framework (EU format with explanations)
4. AWEPA Risk Register
5. Template M&E plan
6. Guidelines for a baseline study and baseline report template
7. Activity reporting form to monitor activity outputs and intermediate outcomes
8. Template activity evaluation form
9. Pre- and post-test template to measure knowledge and skills
10. Data collection manual for Key Informant Interviews, Focus Group Discussions, Surveys and Desk Research
11. AWEPA Evaluation Protocols
12. Template Terms of Reference for an (external) evaluation
13. Database of parliamentary indicators
14. Database with sources for baseline, context and needs-assessment studies
15. General AWEPA reporting templates
16. AWEPA local office Organisational Capacity Scan

Annex 3: Manual for Key Informant Interviews

Key informant interviews are qualitative, in-depth interviews with experts in their field. Interviewees can be experts because of their field of work, their academic background or their position in the community. For the most part, the interviews are loosely structured around topic lists. Key informant interviews resemble a conversation among acquaintances. They should not be rigidly structured, but should rather allow a free flow of ideas and information. It is the interviewer’s job to lead the interview so that it remains conversation-like, while making sure the key topics are discussed.

When to use this tool?

The purpose of key informant interviews is to collect in-depth data on specific issues. By speaking to several key informants from different backgrounds, it is possible to garner a better general understanding of a certain issue. Key informant interviews are also very useful to make sense of, or add value to, quantitative information. Within AWEPA, key informant interviews are generally used in three different ways:

• To add value and explanations to quantitative data that has been collected;
• To gain an understanding of developments within the political system;
• To gain an understanding of developments within civil society.



Practical steps and tips

Key steps

1. Formulate questions based on the indicators for which you need to collect data and any other subjects that might be of interest.

2. Prepare a short interview guide. A key informant interview is not based on a rigid list of questions; it should rather be a conversation. It nevertheless remains important that the interviewer knows what to ask. An interview guide is a collection of the different themes and subjects that need to be discussed. If a specific output is necessary for the indicators, make sure that this is clear to the interviewer. Key questions should be pre-formulated and asked in a similar way to all informants. You can ask the key informant to score certain questions.

3. Select the key informants. Key informants should be selected for their specialised knowledge and unique perspectives on a topic. If you select multiple key informants, make sure that you select people who, in theory, could have a difference of opinion. If your questions pertain to political parties, make sure that you do not interview only supporters of one party.

4. Conduct the interview. Start by establishing a rapport; in other words, make sure that the interviewee feels comfortable and open to answering your questions. You can do this by telling them something about yourself and the research, and starting a ‘normal’ conversation with the interviewee before diving into the interview subject. After a rapport has been established, you can go through the different topics. Make sure that you touch upon all the subjects that were included in your interview guide.

5. Analyse the interview. Extract the information relevant to the indicators and make sure the data feeding each indicator is triangulated. In other words, do not base your conclusions on only one interview, but make sure to gather the information through multiple (minimum three) interviews. Additional information gathered in the interviews can also be very useful for programme development and for making sense of the quantitative data gathered through other methods.

Recordings or notes

You can take notes during an interview, record it, or do both. We recommend doing both. It is not necessary to transcribe the interview; however, it can be very helpful to be able to listen back.

Be aware of biases

Be aware that the information given by key informants can be biased; interviewees only bring across their own point of view. Therefore, make sure to triangulate factual statements through interviews with other key informants, desk research or other research methods.
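The triangulation rule above (conclusions based on a minimum of three interviews) can be checked mechanically when interview notes are logged per indicator. A minimal sketch with invented indicator codes and informants:

```python
# Sketch: flag indicators that are not yet backed by at least three
# key informant interviews. Indicator codes and informant roles are
# invented for illustration.
MIN_SOURCES = 3

interviews = {
    "IO-1": ["committee clerk", "MP (majority)", "MP (opposition)"],
    "IO-2": ["NGO researcher"],
}

def needs_more_interviews(interviews, minimum=MIN_SOURCES):
    """Return indicator codes with fewer than `minimum` interviews."""
    return [ind for ind, sources in interviews.items()
            if len(sources) < minimum]
```

A simple log like this also helps spot one-sided selections, e.g. an indicator backed only by informants from a single party.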

Annex 4: Manual for Focus Group Discussions

Focus group discussions are a qualitative research method in which a group of people is asked about their perceptions, opinions, beliefs and attitudes towards a topic or idea. Questions are asked in an interactive group setting where participants are free to talk with other group members. Discussion is even encouraged; it provides insight into a range of ideas and opinions, and into the variation that exists in a particular group in terms of beliefs and practices. A focus group discussion should be focused: it does not cover a large range of issues, but explores a few topics in greater detail.



When to use this tool?

Focus group discussions are a very powerful tool to assess qualitative questions and are thus used to:

• Understand why people change;
• Understand if and how a broader societal change has taken place;
• Interpret and understand quantitative information.

They represent a good way to gather people from similar backgrounds or experiences to discuss a specific topic of interest, and to gain more insight into the different opinions among the parties involved in a change process.

Practical steps and tips

Invite a small group of people (6 to 12 at most) with specialist knowledge of, or an interest in, a particular area to discuss the specific topics in detail. This knowledge can come from personal experience (e.g. in the case of a focus group with civil society) or scholarly expertise (e.g. in the case of a focus group about certain laws that have been passed).

Topics

Because one focus group can in general only discuss about five questions, a well-thought-through selection of subjects should be made in advance. Make sure that you are well aware of all the indicators, and thus subjects, that need to be discussed in the session.

Key steps

1. Identify the indicators for which you want to collect data through the focus group exercise.
2. Prepare four or five discussable questions that cover the indicators. Note that you can also include questions that need to be scored by the participants.
3. Schedule one to two hours with the participants.
4. Invite a maximum of 12 participants to one focus group discussion.
5. Make sure that there is equality and trust between group participants (if necessary, invite men and women separately; think of differences in age, class, religion or ethnicity).
6. Arrange a comfortable place, where interruptions will be limited.
7. Begin with introductions to ensure all individuals present know each other. Agree on some ground rules for the discussion, for example: everyone has a right to speak, no one has the ‘right’ answer, please don’t interrupt, etc.
8. Introduce the purpose of the meeting and the method that will be used.
9. Facilitate the discussion (see the following section on facilitation).
10. End the session by explaining the next steps in the process and seek the participants’ cooperation in further research.

Selecting a team

Make sure that you have at least two people available during the session: one person to facilitate the session and an observer to take notes. Both roles are very important and cannot be carried out by one person at the same time.
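Where participants score questions (see key step 2), the scores can be summarised with a mean and a range, so that the spread of opinion in the group is visible alongside the average. A minimal sketch with invented scores on an assumed 1-5 scale:

```python
# Sketch: summarising participant scores on one focus group question.
# Scores are invented, on an assumed 1-5 agreement scale.
scores = [4, 5, 3, 4, 2, 4]

def summarise(scores):
    """Mean plus minimum and maximum, to show spread of opinion."""
    mean = sum(scores) / len(scores)
    return {"mean": round(mean, 2), "min": min(scores), "max": max(scores)}
```

Reporting the range matters: a mean of 3.5 reads very differently when all scores are 3-4 than when they span 1-5.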

Background photo: AWEPA in action during the International Parliamentary Seminar in Brussels, October 2015.


Facilitator guidelines

• Be well prepared on the purpose and the topics to be discussed.
• Involve all participants in the discussion.
• Be non-judgmental and open.
• Be a good listener; do not interpret the answers of participants.
• Ensure the discussion is open and flexible for the group, but make sure it doesn’t drift off topic.
• Become more specific in your questions as you get deeper into the topic.
• Avoid yes/no questions, because these won’t encourage participants to express their own ideas and views on a topic.
• Rephrase your question if an answer given is not an answer to the question asked.
• Use pauses; these give participants time to think and to add to what has already been said.
• Use probes: ask questions to get clarification on answers given (Could you explain further? Can you give an example? I don’t understand what you meant by that.).
• Some questions might need to be scored by the participants. Make sure that the scoring steps are clear.

Observer guidelines

• Write down who says what, in order to see a line in the views and opinions of the different participants in the discussion (see ‘Recordings or notes’ above).
• Ensure all topics are fully covered.
• Support the facilitator if necessary (involving everyone and covering all topics).

Materials needed

Notebook and pen, recorder (optional), flipcharts and markers (optional).

Annex 5: Manual for Surveys
Surveys are the most widely used method for collecting data. A survey is an interview based on a fixed list of questions, often multiple choice. The questions are asked in exactly the same way to all respondents, and can be asked face-to-face, over the phone, by e-mail or online. Surveys are the data collection tool used to gather information from large groups of individuals, and the method yields quantitative data.

When to use this tool?
Surveys are used to collect, analyse and interpret the views of a group of people from a target population. The tool is useful when the subjects and questions of the research are already quite clear. For exploratory research, methods based on open questions are more useful. In AWEPA, surveys are generally used for the following:
- A yearly survey of the main target group, such as parliamentarians, to gain insights into their view of the processes in and outside of the parliament.
- To gain an insight into the viewpoints of certain target groups, such as women, youth, minorities and civil society.



Practical steps and tips
Key steps
1. Identify the population that you will survey. Make sure you have a clear picture of the group that you will survey.
2. Clarify the purpose of the survey. What questions need to be answered? Make sure that questions based on all the relevant indicators are included here.
3. Choose a survey method. Decide whether you want to implement the survey by mail, online, or face-to-face. This may depend on the target group of your survey.
4. Sample if necessary. At times your target population will exceed your capacity to reach them all. In chapter 6 (page 10) you will find sampling advice for different target groups.
5. Design the survey. In annex 1 (page 11), you will find a survey template. Make sure that questions covering all the relevant indicators are included in the survey. Keep the flow of the survey in mind: you want to lead the respondent through the survey logically. State each question in a way that is clear and interpretable in only one way for all respondents. Use closed (multiple choice) questions as much as possible; this will make analysis easier. You can add 'other' options with an open space so that answers not thought of in advance can be included.
6. Test the survey. Make sure you test the survey before you send it out. You can ask someone in your office to test the survey or, preferably, ask someone from the target group with whom you have a particularly good relationship to test it. In particular, test whether the questions are clear and understandable. Also make sure that you have an indication of the time it takes to fill out the survey.
7. Revise the survey. On the basis of the test, make revisions to the survey where necessary.
8. Execute the survey. Send out the survey to all the respondents (in your sample). Be aware that getting responses to your survey is not always easy.
It may be necessary to send reminders or provide incentives for filling in the survey. If working with incentives, make clear that the incentive is in no way dependent on the content of the answers.
9. Analyse the data. Once you have collected the data, it is time to analyse it. You can do this in Excel or SPSS. The analyses needed to 'feed' the indicators are relatively basic and focus mostly on percentages and averages. If needed, further explanation of these analyses will be provided during the country workshops.

Work that pays off
Making a good survey can be quite a lot of work. However, the bulk of the work only needs to be done once. With minor adjustments, surveys can be used multiple times.

Including basic questions
Make sure to include questions on age, gender and political affiliation; this will help with the disaggregation of the research findings.
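Step 4 (sampling) and step 9 (percentages and averages) can also be sketched in plain Python as an alternative to Excel or SPSS. This is a minimal illustration only: the question name, answer options, respondents and population size below are hypothetical examples, not AWEPA data.

```python
# A minimal sketch, assuming hypothetical question names and respondents:
# step 4 (simple random sampling) and step 9 (percentages and averages).
import random
from collections import Counter

# Step 4: when the population is too large, draw a simple random sample.
population = ["MP %03d" % i for i in range(1, 301)]   # 300 parliamentarians
sample = random.sample(population, 30)                # 30 randomly chosen respondents

# Step 9: collected answers to one closed question, plus basic background data.
responses = [
    {"gender": "F", "age": 41, "q1_committee_active": "yes"},
    {"gender": "M", "age": 55, "q1_committee_active": "no"},
    {"gender": "F", "age": 38, "q1_committee_active": "yes"},
    {"gender": "M", "age": 47, "q1_committee_active": "yes"},
]

def percentages(rows, question):
    """Share of respondents per answer option, as percentages."""
    counts = Counter(row[question] for row in rows)
    total = sum(counts.values())
    return {answer: round(100 * n / total, 1) for answer, n in counts.items()}

def average(rows, field):
    """Average of a numeric field, such as age."""
    values = [row[field] for row in rows]
    return sum(values) / len(values)

print(percentages(responses, "q1_committee_active"))  # {'yes': 75.0, 'no': 25.0}
print(average(responses, "age"))                      # 45.25
```

The same two functions can be reused each year the survey is repeated, which mirrors the "work that pays off" point: the analysis, like the questionnaire, only needs to be built once.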

Annex 6: Manual for Desk Research
Desk research, as the name suggests, is a research technique that is mainly carried out behind a desk. Desk research basically entails collecting data from existing resources, such as press statements, the internet, analytical reports and academic publications. It is often considered a low-cost technique, as the main cost of the data collection is usually the time of the researcher. The data can be collected both offline and online.



When to use this tool?
A lot of information is readily available, either because it has been researched by others or because it is recorded as part of due process. Through desk research you make use of this available information. A big limitation is, of course, that desk research can only be used when the necessary data is available. Within AWEPA, desk research is generally used in three different ways:
- To monitor political and lawmaking processes;
- To monitor the public expressions of political parties and parliamentarians;
- To make use of secondary data that is collected by other organisations.

Practical steps and tips
Key steps
1. Make an overview of the data that should come from desk research.
2. Identify from which sources the data is available. This can be the internet, public archives, parliament, research institutes etc.
3. Collect the data from the different sources.
4. Assess whether the quality of the data is sufficient (quality of source, year, bias). If not, try to collect the data in a different way.
5. Make sure you record where the information was found.

Asking for advice
The first step when undertaking any desk research is to make sure that the information is publicly available. Discuss this within your country team. If you are not sure whether or where certain information is available, it can be helpful to ask your key informants for advice.

Repeating the collection
Make sure that you record in what way and from what source the data was collected. For the measurements to be comparable it is important to, where possible, use the same method and source for multiple measurements.

Data quality
The quality of secondary data sources is diverse. Where possible, try to make use of 'official' sources, such as government databases and reputable newspapers. If you find facts in a newspaper article, make sure to check the source.
Note: If information for certain indicators is not available through desk research, other methods will need to be selected. The M&E officer can help you identify other ways to collect the data. The most likely option will be Key Informant Interviews.
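The advice above on recording where and how data was found can be supported with a simple source log, so that a measurement can be repeated the same way next time. The sketch below is one possible illustration; the file name, fields and example entry are hypothetical, not a prescribed AWEPA format.

```python
# A minimal sketch (hypothetical fields and file name): a CSV source log
# recording where, how and when each desk-research data point was collected.
import csv

LOG_FIELDS = ["indicator", "source", "method", "date_collected", "value"]

def log_entry(path, entry):
    """Append one collected data point, recording where and how it was found."""
    with open(path, "a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=LOG_FIELDS)
        if f.tell() == 0:          # new file: write the header row first
            writer.writeheader()
        writer.writerow(entry)

log_entry("desk_research_log.csv", {
    "indicator": "Number of bills tabled",
    "source": "Parliament website, order paper archive",
    "method": "Manual count of archived order papers",
    "date_collected": "2016-03-01",
    "value": "14",
})
```

Because each row names the source and method alongside the value, a colleague repeating the measurement later can use exactly the same source and method, which is what makes the measurements comparable over time.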

Back cover photos: AWEPA in action during the “Peace, Security and Sustainable Development” Parliamentary Seminar in Brussels, 2015.



AWEPA International
Prins Hendrikkade 48-G, 1012 AC Amsterdam, the Netherlands
t: +31 20 524 5678 | f: +31 20 622 0130 | e-mail: amsterdam@awepa.org

Belgium: brussels@awepa.org
Benin: benin@awepa.org
Burundi: burundi@awepa.org
DRC: rdc@awepa.org
Kenya: kenya@awepa.org
Mali: mali@awepa.org
Mozambique: mozambique@awepa.org
South Africa: southafrica@awepa.org
South Sudan: southernsudan@awepa.org
Tanzania: tanzania@awepa.org
Uganda: uganda@awepa.org

Find AWEPA on www.awepa.org | www.twitter.com/AWEPA | www.facebook.com/AWEPAInternational

