Regeneration Project Evaluation


IJNR The International Journal of Neighbourhood Renewal

Neighbourhood Renewal and Sustainable Communities Training Programme

2009 / 2010

Evaluation Toolkit

IJNR A Guide to Programme and Project Evaluation: A Participatory Approach


1 INTRODUCING THE GUIDE

Evaluation can be useful, exciting and an important knowledge development tool. This evaluation guide has been developed to help make all these things happen. The goal of this guide is to provide an easy-to-use, comprehensive framework for project evaluation. This framework can be used to strengthen evaluation skills and knowledge and to assist in the development and implementation of effective project evaluations.

1.1 Why evaluate?

Effective project evaluations can:
(a) Account for what has been accomplished through project funding.
(b) Promote learning about which health promotion strategies work in communities and which don't.
(c) Provide feedback to inform decision-making at all levels: community, regional and national.
(d) Contribute to the body of knowledge about health promotion.
(e) Assess the cost-effectiveness of different health promotion strategies.
(f) Position high quality projects for future funding opportunities.
(g) Increase the effectiveness of project and program management.
(h) Contribute to policy development.

A good project evaluation provides an extremely useful tool to manage ongoing work, identify successes and plan effectively for new health promotion initiatives.

1.2 Getting started

The Guide to Programme and Project Evaluation: A Participatory Approach provides direction for your work in planning and implementing effective project evaluations. While no single resource can answer all your questions, we hope that this guide provides you with clear directions. Add to it, adapt it, and customize it to meet your own needs.

1.3 A note on terminology

For many people the language of evaluation is a barrier that prevents them from getting on with the real evaluation work. This guide attempts to avoid this problem by using plain language throughout. To make the guide as practical as possible it includes:
(a) A framework to guide the step-by-step process of developing effective evaluations.
(b) Activities to introduce and plan for project evaluation.
(c) Examples demonstrating the application of the evaluation framework to health promotion projects.

2 EVALUATION FOR LEARNING



This guide is based on the belief that evaluation can be a useful and positive experience that promotes learning and action. What is learned from project evaluation is as important as what the project produces or creates.

2.1 Participatory evaluation

Participatory evaluation work supports good neighbourhood renewal practice because it is a collaborative approach that builds on strengths and that values the contribution of everyone involved. While there are other approaches to evaluation, a participatory approach seems most consistent with the goals of developing good neighbourhood renewal projects and programmes.

2.2 Principles of participatory evaluation

Participatory evaluation focuses on learning, success and action. The evaluation must be useful to the people who are doing the work that is being evaluated. The evaluation process is ongoing and includes ways to let all participants use the information from the evaluation throughout the project, not just at the end. Recognition of the progression of change - in knowledge, attitudes, skills and behaviour - is built into the evaluation. The project sponsors are responsible for defining the specific project evaluation questions, the indicators of success and realistic timeframes. Participatory evaluation makes it possible to recognize shared interests among those doing the work, the people the work is designed to reach, the project funders and other stakeholders.

2.3 Putting participatory evaluation into practice

Participatory evaluation calls for collaboration among those who share a common interest in a neighbourhood renewal programme. The collaborative process starts at the beginning of a project and continues throughout the life of the project. This type of evaluation is never a one-time, end-of-project event. Collaboration allows those involved in the project to:


(a) work in partnership with community groups to do evaluation
(b) recognize the experience and expertise of community groups
(c) recognize the health outcomes of the project
(d) make evaluation questions and findings relevant to all stakeholders
(e) increase the acceptability of and support for the evaluation process and outcomes
(f) produce more meaningful results that can be used by both programs and projects to learn how to improve the work being done and to influence policy and program directions.

2.4 Principles of a participatory approach to evaluation



Participatory evaluation encourages a positive experience with the evaluation of neighbourhood renewal activities.

Participatory evaluation focuses on learning, success and action. An important question to ask in evaluation is what we learned about what worked and what did not work. Then we need to ask how we can use these learnings to move to action. The people and groups most directly involved decide what determines success.

The evaluation is useful to the people who are doing the work that is being evaluated. The project's goals and objectives - what the project intends to accomplish - must be the standards against which the project work is measured. Evaluators must pay special attention to the project's specific needs and available resources.

The evaluation process is ongoing and includes ways to let all participants use the information from the evaluation throughout the project, not just at the end. The material produced for the evaluation must be given back to the participants on an ongoing basis in a format that is useful and clearly written in plain language.

Recognition of the progression of change - in knowledge, attitudes, skills and behaviour - is built into the evaluation. To measure people's success in changing knowledge, attitudes, skills and behaviour, think in advance about the kinds of changes the project strategies and activities can produce. It is important to describe how these changes can be recognized and measured in a way that is possible and practical within the timeframe and resources available to the project.

The project sponsors are responsible for defining the specific project evaluation questions, the indicators of success and realistic timeframes. Community sponsors of projects must participate in decisions about what questions will be asked and what information will be collected to measure the difference the work made in a given period.

Participatory evaluation makes it possible to recognize shared interests among those doing the work, the people the work is designed to reach, the project funders and other stakeholders.

The evaluation must include information and input from the people doing the work, the people who the work is designed to help or reach and the project funders.

3 A FRAMEWORK FOR PROJECT EVALUATION

Project evaluation is challenging work because of the great diversity in the types of projects funded. To be effective, an evaluation framework must respect and respond to this diversity. It must also provide a consistent and common process that applies across projects, ensures accountability and produces evidence-based results that promote learning about what contributes to better health practices. The evaluation framework presented in this guide meets this challenge. It is composed of two parts:
(a) Five key evaluation questions



(b) Five evaluation process steps

The five evaluation questions form the core of the framework and can be applied to all types of project activities. The five evaluation process steps outline a systematic approach to the tasks that projects need to complete to answer the evaluation questions. Groups work through the steps to plan and implement the evaluation. The following two sections discuss the evaluation questions and the process steps.

3.1 The five key evaluation questions

The process of developing the answers to the evaluation questions will vary, as each project varies, but the five fundamental questions remain the same.

1. What? Did we do what we said we would do?
2. Why? What did we learn about what worked and what didn't work?
3. So what? What difference did it make that we did this work?
4. Now what? What could we do differently?
5. Then what? How do we plan to use evaluation findings for continuous learning?

3.2 Did we do what we said we would do? "What?" (Description of activities)

The responses to this question describe the work done in the project and the relevance of this work in meeting the project goals and objectives. The project success indicators provide the criteria against which success is measured. They assist the project sponsor to collect the information needed to answer this and subsequent evaluation questions. Some of the more specific questions that may need to be answered to describe the project work include the following:
(a) What activities were undertaken and how did they link to meeting the project goals and objectives?
(b) What were the major achievements of the project and what resources did they require?
(c) If the objectives changed during the course of the project, how and why did they change?

3.3 What did we learn about what worked and what didn't work? "Why?" (Reasons for success)

Participatory evaluation focuses on success, learning and action. Finding out what worked well in a project and what didn't work well puts this principle into practice. Here are some of the questions that could be included in this discussion:
(a) What strategies worked well for involving the target population in the project? Why?
(b) What strategies didn't work well for involving the target population in the project? Why?



(c) What strategies worked best for broadening the base of community support for the project? Why?
(d) What strategies didn't work well for broadening the base of community support for the project? Why?
(e) Which activities and strategies did we change? Why?
(f) What was learned about the relative cost-effectiveness and efficiency of various project strategies and activities?
(g) How realistic and relevant were the project goals and objectives?
(h) In what ways did the project planning process work most effectively?
(i) What did we learn about working together as a group?

3.4 What difference did it make that we did this work? "So what?" (Impact)

The answers to this question measure a project's success in changing knowledge, attitudes, skills and behaviour. The project success indicators represent the group's assumptions about what changes should be expected from the project work and provide the criteria against which to measure change both during and at the end of the project. There are two main ways project sponsors can assess impact: by using summarized data related to the success indicators and by asking specific impact questions of people who were involved in the project and who were the target of the project's work. The following types of questions may be helpful in discussions about this part of the project evaluation:

(a) What changed as a result of the project?
(b) Were there any unexpected changes resulting from the project work? Describe them.
(c) In what ways did this project contribute to increased public participation?
(d) In what ways did this project help to strengthen community groups?
(e) To what extent did the project reduce barriers to health?
(f) What evidence is there to attribute any of the above changes to the project? What other factors outside the project might have contributed to the changes?
(g) Were other initiatives started, alternative services proposed or new funding resources acquired as a result of this project?
(h) In what ways did this project contribute to better neighbourhood renewal practices?
(i) What new partnerships developed from this project? What was the nature of the partnerships and what was their contribution?
(j) Is the model or approach continuing beyond the initial funding?
(k) To what extent is this model or approach transferable to other communities?

3.5 What could we do differently? "Now what?" (Future of this and other projects)

Evaluation is for learning and often the best learning comes from examining the challenges that projects present. Here are some of the questions that could be included in this discussion:



(a) What more effective methods for achieving the objectives emerged from the work?
(b) What additional knowledge development is required to do the work more effectively?
(c) What additional support from the funders and community sponsoring agencies would have been useful to the project in meeting its goals and objectives?
(d) Are there more cost-effective ways to achieve the project's objectives?
(e) Who else could have been involved in the work?
(f) What could we do to expand the network of people involved in working on this issue?
(g) Were all the project's needs met?
(h) Is there a better way of developing realistic project goals and objectives in the initial planning stage?
(i) How did management and administrative systems change through the project to become more effective?

3.6 How do we plan to use evaluation findings for continuous learning? "Then what?" (Use of evaluation results)

Participatory evaluation includes ways to use the evaluation results throughout the project as well as at the end. Some questions to consider in developing the evaluation are as follows:
(a) How were evaluation findings used on an ongoing basis to contribute to the planning and implementation of the project strategies and activities?
(b) How will project findings be used for future knowledge development?
(c) How will the final evaluation learnings be documented and distributed?
(d) Are there alternative ways to present the evaluation results so that more people can make use of the learnings?
(e) How will the evaluation results be used for new project planning?
(f) How will the evaluation results be used to influence policy and research priorities?

Seeking answers to the five key evaluation questions will guide the evaluation process throughout a project. The learnings from answering the questions can then be used to shape current and future work.

3.7 The five key evaluation steps

The steps to developing answers for the five key evaluation questions are briefly outlined below, and then are further developed in the next five chapters of the guide.

A. Define the project work.

To evaluate a project there must be clear, measurable project goals and objectives that outline what the project plans to accomplish. While this may seem self-evident, many evaluations have gone off the track because this initial work has not been done.

B. Develop success indicators and their measures.



The process of defining what constitutes success for a project is another important step in developing evaluations. Project sponsors need to define the success indicators for their projects. The success indicators allow project sponsors to evaluate whether they accomplished what they set out to do and what the impact of their project has been.

C. Collect the evaluation data.

After the first two steps have been completed, it is necessary to decide:
(a) what information the project needs to collect
(b) who has the information
(c) how the information will be collected.

D. Analyse and interpret the data.

As the evaluation data is collected, it should be summarized and analysed and key learnings should be identified. This ongoing process will help projects prepare their final evaluation reports.

E. Use the evaluation results.

Evaluation findings can be used throughout the project to improve the planning and implementation of project activities. By sharing project results with others, each project adds to the body of knowledge about health promotion.

Working through these five steps will provide project sponsors with the information and tools they need to answer the five key evaluation questions. For small projects with limited resources, the process will be simple and straightforward. For large projects with greater resources, the work involved in each step will vary to reflect the complexity of project goals and objectives. For all projects, project sponsors should:
(a) set realistic limits on the number of project-specific evaluation questions and on the amount of evaluation information to be collected, as determined by the evaluation resources available to the group
(b) remember that the quality of information collected, not the quantity, is the most important factor in evaluation.

3.8 Tools for using the evaluation framework

To help in applying the evaluation framework, several different tools have been developed, which include:
(a) needs assessments
(b) education and awareness
(c) resource development
(d) skills development
(e) developing innovative models

3.9 A framework for project evaluation



This overview is a useful tool that can be used for:
(a) introducing the framework
(b) reviewing the evaluation work

3.10 Steps in the Evaluation Process

1. Define the project work - clear, measurable project goals and objectives.
2. Develop success indicators - a process for identifying indicators and their measures.
3. Collect evaluation data - examples of how to do this include:
(a) Written questionnaire
(b) Telephone survey
(c) Reaction sheet
(d) Interview - face-to-face or phone
(e) Focus group
(f) Participant observation
(g) Project diary
(h) Program records
(i) Before and after questionnaires
(j) Non-traditional methods of documentation

4 DEFINING PROJECT WORK

Evaluation isn't something that happens at the end of a project. It is a process that begins when the project begins with the development of goals and objectives, and it continues throughout the life of the project. It is through the evaluation process that we learn whether projects are meeting their goals and having an impact on the attitudes and health practices of Canadians.

4.1 Developing project goals and objectives

The project goals and objectives describe what the project wants to accomplish and provide the context in which the five evaluation questions are answered. If the project goals and objectives are not clear, it will be very difficult to answer the first evaluation question, "Did we do what we said we would do?" Goals are general statements of what a project is trying to do.



Objectives are specific, measurable statements of the desired change(s) that a project intends to accomplish by a given time.

4.2 Writing project objectives

Clear project objectives are essential to project work and effective evaluation. Good project objectives set the groundwork for demonstrating the impact of the project. Writing project objectives, however, can be challenging for many groups. Many people confuse objectives with activities. For example, a project may state that its objective is to create a video explaining how Swine Flu is transmitted. Creating a video is an activity; the objective the activity wishes to achieve is an increase in knowledge of how Swine Flu is transmitted.

There are two helpful guidelines to use in writing good project objectives: (1) identify the specific changes the project is designed to accomplish, and (2) ensure that these changes are measurable. To help identify the specific project objectives, it is useful to ask the question: what are we trying to change? Projects generally focus on change in the following key areas:

(a) Knowledge (increasing knowledge on a particular issue or subject)
(b) Attitudes (creating an attitude that favours a desired behaviour)
(c) Skills (developing the individual capacity to adopt a given behaviour)
(d) Behaviour (maintaining or adopting a healthy behaviour)

These key areas may be seen as a kind of continuum of change. A change in knowledge can lead to new attitudes. Developing skills can enable people to make positive changes in their behaviour. Once the areas of change have been identified, it is important to ensure that they are measurable. There are five important elements to consider when creating project objectives that are specific and measurable. These elements are listed below in no particular order:

(a) The date by which the change will occur.
(b) The specific change desired (use an action verb).
(c) A measure (a number or percentage).
(d) The target group.
(e) The location.

Although their use may vary from one project to another, a good rule of thumb is to write project objectives that include these five elements.

4.3 Role of the outside evaluator

In small projects with limited resources, the evaluation can usually be done by the project sponsors. Larger projects, having correspondingly larger evaluation requirements, often hire an outside evaluator. If an outside evaluator is being used, it is essential that project sponsors clarify the evaluator's roles and responsibilities. Questions to consider when hiring an outside evaluator:



(a) What will be the relationship between the project sponsor and the outside evaluator?
(b) What work will the evaluator be responsible for? A detailed workplan should be agreed upon in advance.
(c) What credentials and experience will be required of the evaluator?
(d) How will the evaluator be informed of and held accountable to the evaluation framework and the principles on which it is based?
(e) How does the project sponsor plan to handle any disputes with outside evaluators?

To assist in the effective use of outside evaluators, it is helpful to have the following information available:
(a) A list of possible evaluators, including their profiles: what their strengths and weaknesses are, projects they have worked on and any previous experience of working with them.
(b) Ideas on different roles for outside evaluators, e.g. working with project sponsors to develop the evaluation plan, developing some or all of the data collection tools, analysing the data, writing the summary reports.
(c) Sample contracts with outside evaluators.
(d) Guidelines on when to use outside evaluators for projects.

5 DEVELOPING SUCCESS INDICATORS

Identifying what success will look like during the developmental phase of a project may seem a little like putting the cart before the horse. Many project sponsors spend a lot of time developing goals and objectives, planning activities and thinking about budgets. The real challenge is to think to the end of the project and name the identifiable changes that they expect to occur as a result of doing the work. These identifiable changes, the success indicators, should be developed as soon as clear project goals and objectives have been established. Identifying success indicators is therefore the second step in the process of planning high quality project evaluations. Project sponsors should identify the success indicators that are most appropriate and best reflect the reality of their own projects.

5.1 Purpose of success indicators and their measures

Success indicators are a group's assumptions about what changes should be expected from doing the project work. These indicators are quantified by specific measures, for example a number, a percentage or a level of satisfaction. Success indicators and their measures need to link directly to project goals and objectives, since they provide the objective and measurable criteria by which groups judge the degree of success they have had in reaching their goals and objectives.

Through their project activities, project sponsors attempt to change the knowledge, attitudes, behaviour or skills of a selected group of people - sometimes referred to as the target group. To measure or evaluate the amount of change, it is useful to know the status of the target group's knowledge, attitudes, behaviour and skills at the beginning of the project. Determining this initial status or starting point is called setting a benchmark.
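For example (an illustrative indicator, not drawn from the guide): if a project's objective is to increase residents' knowledge of how Swine Flu is transmitted, a success indicator might be the proportion of surveyed residents who can name three transmission routes. The benchmark could be the figure from a short questionnaire at the start of the project (say 40% of respondents), and the measure of success the figure from the same questionnaire at the end (say 70%). The numbers here are invented purely to show how an indicator, its measure and its benchmark fit together.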



5.2 The process of developing success indicators and their measures

Choosing which indicators are the "best" is not an exact science. The process that project sponsors go through to identify their success indicators is as important as the final list of indicators created. Done well, this process can contribute to the building of commitment and excitement for doing an evaluation. It also helps groups develop reasonable expectations of what can be achieved.

Some guidelines for developing success indicators

Success indicators should:

1. Be results-focused, i.e. refer to results or outcomes of the funded activity and not the activity itself.
2. Be challenging but feasible.
3. Involve a meaningful comparison - a comparison over time, a comparison with other similar activities or a comparison against a reasonable standard.
4. Be measurable, using quantitative or qualitative measures. In developing indicators, consideration should be given to data availability and data collection.
5. Refer to a result or outcome that can be reasonably attributed to the project activity.
6. Be as valid (directly related to the work done and not attributable to other factors) and reliable (able to be replicated) as possible.

Benefits of developing good success indicators:
(a) Clarification of project goals and objectives to make them measurable
(b) Identification of innovative success indicators that reflect unique community characteristics and needs
(c) Strengthened strategies and workplans to address some of the identified barriers to success
(d) Increased commitment to assess impact questions

6 COLLECTING EVALUATION DATA

6.1 Introduction

Participatory evaluation relies on a systematic and rigorous collection of information from project staff and stakeholders. It draws on both quantitative and qualitative data to measure success and to clarify and make decisions about project characteristics, activities and effects.

Three questions to ask in determining evaluation information needs:

1. What information is needed?
2. Who has the information?
3. How will the information be collected?
1. What information is needed?

Projects need to collect evaluation information that will provide answers to the five key evaluation questions. The specific type of information to be collected is determined by the work done at the beginning of each project to define the project goals, objectives and success indicators.

2. Who has the information?

Depending on the nature of the project, the people with information useful to the project evaluation will vary widely. People from whom it may be important to collect information include:
(a) Project sponsors, staff and volunteers
(b) Program consultants
(c) Target population
(d) Consumers of the service
(e) The general public
(f) Advisory committee members
(g) Other service providers
(h) Partners associated with the project.

3. How will the information be collected?

Project sponsors decide how best to collect evaluation information based on their project's needs and resources. Designing the information collection tools should be done in collaboration with the people who will be using them. Most community projects don't have the time or the resources to put into extensive recording of data and statistics. The goal is to find ways of collecting information that do not put too much of a burden on the people doing the project work but that still provide the information required to answer the evaluation questions.

The characteristics of a good information collection process are:
(a) useful
(b) practical
(c) collaborative
(d) systematic
(e) ongoing
(f) accurate
(g) ethical

6.2 Information collection tools

There is a wide variety of information collection tools that can be used depending on the project's evaluation needs. Examples of tools that have been used in other projects are described below.

Written survey questionnaire
A structured questionnaire used to reach large numbers of people. It provides quantitative data (numbers) that can be statistically analysed and qualitative information that can be summarized, and is used to survey the target population in terms of knowledge, attitudes, beliefs and behaviour.
Tips and cautions:
(a) When developing the questions for the questionnaire, ensure that they are not worded in ways that lead to biased or misleading responses.
(b) While mass mailing of survey questionnaires has the advantage of reaching large numbers of people ...

Telephone survey
Can ask for the same types of information as the written survey questionnaire.
Tips and cautions:
(a) Telephone interviewers may face resistance from people who are tired of answering this type of call.
(b) Ensuring that the respondent is provided with clear information on the credibility of the group doing the survey may increase the response rate.
(c) Finding a convenient time for the respondent to answer the survey questions may increase the response rate.

Reaction sheet
This is a simple kind of questionnaire that asks questions about people's satisfaction with a particular activity. It is easy and fast to administer and summarize, and a useful tool for getting an immediate response to new resource materials, workshop models and public education events.
Tips and cautions:
(a) Avoid using leading questions that prompt positive responses (e.g. "Did you enjoy the workshop?").
(b) Limit the number of questions to increase the response rate.
(c) Include open-ended questions to obtain qualitative data.

Individual interviews
Individual interviews are structured around a set of open-ended questions that are developed to guide the interview and to provide consistency in the information collected. They are a useful method for getting in-depth information on project activities and provide an opportunity to clarify responses and probe for further information.
Tips and cautions:



(a) This tool can be used with a specific group of people (e.g. project staff, to gather their opinions about the strengths and weaknesses of the project).
(b) It can also be used with key informants who are knowledgeable about the project (e.g. frontline service providers).
(c) It is a good method to use with respondents who have low literacy levels and might be uncomfortable with written data collection tools.
(d) The interviewer needs to be trained not to bias the responses through the use of leading questions.

Telephone interview
This has a similar process and function to the face-to-face interview, but is conducted by phone and is less expensive to administer.
Tips and cautions:
(a) Sending the respondent a copy of the interview guide in advance may promote a more thoughtful discussion.
(b) Interviews, both in person and by phone, are an alternative to focus groups when you want to avoid group influences on the responses given.

Focus group
This is a group discussion in which 10 to 12 people are brought together in a single session of approximately an hour to generate ideas and suggest strategies. It is facilitated using a specific agenda of structured questions, similar to the interview guide, that focuses the discussion in the meeting. It is used to obtain in-depth understanding of attitudes, behaviour, impressions and insights (qualitative data) on a variety of issues from a group of people, e.g. project staff or a project advisory committee.
Tips and cautions:
(a) The facilitator must remain neutral and non-judgmental and have the skills to keep the discussion moving and on track.
(b) This is a particularly useful method for reflecting on evaluation findings and identifying key learnings.

Participant observation
This involves actual observation rather than asking questions and is used to better understand behaviours, the social context in which they arise and the meanings that individuals attach to them. Observers compile field notes describing what they observe; the analysis focuses on what happened and why.
Tips and cautions:



(a) This may be the most feasible way to collect data from some hard-to-reach populations.
(b) As with all qualitative techniques, the results may not be fully generalizable to the entire study population.

Project diary
With a project diary, project managers, staff or participants are asked to keep a record of their experiences while working on the project; it can provide qualitative evaluation data.
Tips and cautions:
(a) It is important to provide the participants with clear guidelines on keeping a log book.
(b) This is a useful method for identifying unintended consequences of a project.
(c) Some people are very uncomfortable with this method because of the unstructured nature of the writing required.

Program documentation
This includes an analysis of written records (minutes of meetings, telephone logs, intake forms, policy directives, financial records, attendance records) and can provide information on people's interests, preferences and patterns of usage of services and service locations. Through systematic review it can often provide important evaluation information, both quantitative and qualitative, and it is an inexpensive source of information.
Tips and cautions:
(a) This tool is limited in that records document only existing alternatives; they don't show other needs, wants or preferences.
(b) It is important to identify evaluation information needs at the beginning of a project to ensure that the necessary records are kept.

Non-traditional methods of documentation
Non-verbal or non-written evaluation tools can be used to respect diversity and accessibility issues; examples include cartooning, drawing, poster making, photography, videotaping, audio taping and scrapbooks.
Tips and cautions:
(a) Qualitative data collected may be difficult to analyse and generalize.
(b) This is a useful method for getting responses from respondents who are uncomfortable with written tools.



No single evaluation tool can provide all the evaluation information required. A combination of different tools that suit the project needs and available resources has to be developed. Whichever tools are selected, they should reflect the characteristics of a good information collection process outlined above.

7 ANALYSING AND INTERPRETING DATA

7.1 Introduction

Most evaluation projects have no problem with collecting large amounts of evaluation information. What they sometimes do have difficulty with is effectively analysing, summarizing and using the results. The emphasis throughout this guide is on evaluation for learning and action. This section focuses on practical ways that people at the national, regional and community levels can turn evaluation information into usable, accessible summaries and reports that add to the body of knowledge about project success and promote change in attitudes, skills and behaviour. Committing adequate resources at all levels to do the evaluation work is essential if everyone is to benefit from the valuable learnings that can be gained from evaluating health promotion projects.

7.2 Analysing project evaluation information

Analysing evaluation information begins with a review of all the collected data to find the emerging themes or patterns. The five key evaluation questions provide useful categories around which to group information and develop the themes. Look for and record the information that is in the data about how well the project is doing, what is working, what should be done differently and what difference it is making.

Project sponsors may want to record notes on the data on file cards or sheets of paper - one for each question, issue or topic. This makes it possible to see the emerging patterns more easily. Include exact quotations from the interviews and questionnaires. It is essential to stay with what people have said and let the data guide the analysis. Too much detail is better at this stage than not enough. It is always easier to cut down than to add information later.

Once the material has been grouped into themes, it can be analysed to see how the results compare to the changes that were expected as identified by the success indicators. Take the time to reflect on what the analysis reveals. What was learned to answer the "what", "why", "so what", "now what" and "then what" evaluation questions? People who have been involved in the project should be involved in the interpretation of the findings.

Project sponsors or the project evaluator should prepare short summaries of the key learnings from the analysis on a regular basis - for example, every three months or after each project activity. The importance of preparing these brief summaries, which highlight two or three key learnings, cannot be overemphasized. The summaries provide an excellent means of letting the key players in the project know about and begin to use the evaluation findings throughout the project - one of the basic principles of participatory evaluation. By completing summaries of key learnings at regular intervals, the work at the end of the project will be greatly reduced.



Summary - Analysing evaluation information
(a) Review the collected evaluation material for emerging themes and patterns.
(b) Use the key evaluation questions to group the material into themes.
(c) Analyse the material by themes, comparing the results to the changes that were expected as identified by the success indicators.
(d) Reflect on what the analysis means. Ask other key project players for their interpretations.
(e) Prepare short summaries of key learnings under each theme.
(f) Prepare summaries of key learnings on an ongoing basis.
(g) Submit the summaries to the participants for their feedback and verification of the findings.
(h) Develop the final analysis.

Analysis of quantitative data
Quantitative data looks at the incidence and quantity of events. Data gathered through quantitative methods (surveys, questionnaires, administrative records) is numerical and may be analysed by calculating averages, ranges, percentages and proportions. Descriptive statistics simply account for what is happening in numerical terms. For example, when evaluating the use of a needle exchange system, an estimate may be made of the average number of people using the facility each week or the percentage of users returning needles. Bar charts, pie charts, graphs and tables can be effective ways to present the statistical analysis in a clear and concise manner. A short illustrative sketch of this kind of calculation is given after the note below.

Analysis of qualitative data
Qualitative data is information that is primarily expressed in terms of themes, ideas, events, personalities, histories, etc. Data is gathered through methods of observation, interviewing and document analysis. These results cannot be measured exactly, but must be interpreted and organized into themes or categories. The primary purpose of qualitative data is to provide information to the people involved in the project. This standard of usefulness is an important one to keep in mind when analysing qualitative data.

Note: Neither the quantitative nor the qualitative approach to the collection and analysis of data is inherently superior. Each has advantages and disadvantages. For both, it is important to know the context within which they have been used in order to understand the analysis. Whenever possible, project evaluations should include several types of information collection tools. The analysis and summaries of key learnings should draw on information collected from all of them.
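To make the quantitative side concrete, the short sketch below shows one way descriptive statistics like those described above could be produced. It uses only the Python standard library; the weekly figures and field names are invented for illustration and are not taken from any real project.

    import statistics

    # Hypothetical weekly records from a needle exchange project
    # (numbers invented purely for illustration).
    weekly_records = [
        {"week": 1, "visitors": 42, "needles_given": 300, "needles_returned": 210},
        {"week": 2, "visitors": 51, "needles_given": 340, "needles_returned": 255},
        {"week": 3, "visitors": 47, "needles_given": 310, "needles_returned": 248},
        {"week": 4, "visitors": 56, "needles_given": 360, "needles_returned": 306},
    ]

    # Average number of people using the facility each week.
    average_visitors = statistics.mean(r["visitors"] for r in weekly_records)

    # Percentage of needles returned across the whole period.
    total_given = sum(r["needles_given"] for r in weekly_records)
    total_returned = sum(r["needles_returned"] for r in weekly_records)
    percent_returned = 100 * total_returned / total_given

    print(f"Average visitors per week: {average_visitors:.1f}")
    print(f"Needles returned: {percent_returned:.1f}%")

A spreadsheet would do the same job; the point is simply that averages and percentages like these, presented in a small table or chart, help answer the "what happened" part of the evaluation.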

7.3 Preparing useful evaluation reports

Once the evaluation information has been analysed, the next challenge is to present the learnings in ways that are both informative and interesting. The brief summaries of key learnings, described in the preceding section, are often all that is needed to provide information on an interim basis. However, the final project report requires more data. The next section provides some ideas that might be useful for clarifying the expectations about the final report with project sponsors.

Evaluation report outline



Having an outline at the beginning of a project about how the final report will be developed is extremely useful. It helps shape the thinking about what information is needed and how it will be collected, analysed and used. There are two questions to consider when planning evaluation reports.

Who is writing the report? Small projects with very limited resources should have different expectations placed on them than larger projects or projects with funding for an external evaluator.

Who is the report for? While every evaluation report should be written in an interesting and clear style, the structure and emphasis of the report may vary depending on who it is for. For example, is it intended primarily for the funder or for the project participants? The former might focus more heavily on learnings about the cost-effectiveness of strategies; the latter might be more interested in learnings about how to implement a specific health promotion activity.

The following sections form the basic structure - the bare bones - of an evaluation report. Personal stories and quotations from the project participants put a human face on the evaluation results and can make the report much more interesting and user-friendly. Groups can adapt and build on the following guidelines to develop evaluation reports that reflect the unique nature of specific projects.

Example of an outline for a project evaluation report

Section 1:

Executive Summary
This section is for people who are too busy to read the whole report.
- One page is best - never more than three.
- It comes first but is the last piece written.
- It usually looks at what was evaluated and why and lists the major conclusions and recommendations.

Section 2:

Background Information - Getting started
This section provides background leading up to the evaluation:
- how the project was conceived
- why it was needed
- the project goals and objectives
- who was involved in the work
- the project organizational structures.

Section 3:

Description of the Evaluation - How we learned
This section describes:
- the evaluation approach and how it was chosen



- evaluation goals and objectives
- how the evaluator was selected and managed
- how the data collection tools were designed and used
- how well the data collection tools worked
- any limitations of the methodology
- how people were selected to be interviewed, or to receive questionnaires, etc.
- who did the interviewing, the number of people interviewed and their situation
- how questionnaires were distributed and returned.

Section 4:

Evaluation Results - What we learned
One way to organize this section is around the first four evaluation questions:

Did we do what we said we would do?

- Outline the goals and objectives of the project.
- Record what happened as a result of the project - e.g. resources developed, training sessions completed, etc.
- Describe the changes that occurred in relation to the success indicators.

What did we learn about what worked and what didn't work?

- Outline key learnings from the project about making things work. Examples: producing effective resource materials, structuring productive advisory committees, conducting needs assessments in rural and isolated communities, building community ownership of health promotion projects, etc.
- Identify learnings about what strategies didn't work and why.

What difference did it make that we did this work? (outcomes)

- Outline results from the evaluation that show how the project made a difference to consumers, project sponsors and the wider community.
- Identify any changes - of attitudes, knowledge, skills or behaviour - that occurred from the project work, e.g. how health practices have improved.



- If appropriate, show how the project contributed to increased public participation and strengthened community groups.
- Include personal statements and anecdotal material from project evaluations which illustrate the impact an activity has had on project participants. Example: "One thing I plan to use right away in my work which I got from the training is..."

What could we do differently?

- List learnings from the projects about different ways to do the work. Examples: improving the cost-effectiveness of projects, adapting the project model to make it more responsive to volunteers, changing the reporting role for outside evaluators to improve accountability, etc.
- Reflect on cautions and challenges about doing the project work.

Section 5:

Conclusions and Recommendations - Final thoughts on what we would like others to know
- Conclude with a summary of the work done and how well the goals and objectives were reached. Include recommendations for further work.
- Include recommendations on how the evaluation results can be used.

Section 6:

Appendices
- These may include copies of questionnaires or interview schedules, statistical information, program documents or other reference material important to the evaluation but not important enough to go into the text.
- It is useful to include a bibliography - a list of the sources used to compile the evaluation results, other research studies and articles. A list of who was interviewed or organizations contacted may also be included.

8 USING EVALUATION RESULTS

The fifth and final key evaluation question in the framework is, "How do we plan to use evaluation findings for continuous learning?" This is a question that needs to be considered at the very beginning of a project and not only at the end, as is often the case. Having ideas at the start of a project about uses for the evaluation findings helps ensure that the evaluation is conducted and the results reported in a way that meets people's needs. If key stakeholders are involved from the beginning, it increases their support for the process and their likelihood of using the results as they become available.



There are several major ways in which project evaluations can be used to maximize their benefit. A few ideas are listed below.

8.1 Using evaluation results

- Bring together all project staff on a quarterly basis to discuss the evaluation results and look at ways the results can be used to increase performance, improve project administration, enhance planning activities, etc.
- Present the report orally to staff, funders and community members.
- Develop a news release outlining the main learnings from the evaluation and some of its more important conclusions. Send the news releases to key community contacts and evaluation participants.
- Involve project participants in developing ways to present the project findings. Build on their stories and personal experiences to give the results a human face and to create interest in the evaluation.
- Make a presentation on the evaluation results to the local health council or social planning group, highlighting the accomplishments and describing how the results can be used to promote better planning.
- Use the evaluation results to shape requests for new or continued funding or for suggesting alternative health practice models.
- At the start of each new project proposal process, review evaluations from past projects to discover what learnings are transferable.
- Send a letter thanking all project participants for their work on the project and include a summary copy of the key evaluation results.
- Develop a short video of project participants discussing what they learned from the project. Use it to promote the project with community groups and with funders.
- Build the evaluation results into presentations to local service clubs to show how their funding support could be effectively used.
- Commit 15 minutes of time at meetings to information sharing about key learnings from project evaluations.
- Extract highlights of project evaluation reports and distribute them regularly.
- Develop a workshop to present the project evaluation results at a regional or national conference of health promotion professionals.
- Identify other projects that are doing related work and share evaluation reports with them.
- Organize a brainstorming session involving staff to come up with creative ideas to document and promote project successes.



- Develop a user-friendly yearly summary of key evaluation results from across projects. Include ideas for using the results to strengthen planning and distribute the summary to key stakeholders.
- Make presentations to other health care practitioners, using project evaluation results to show how they can benefit from involvement in health promotion work.
- Systematically review and summarize all project evaluation results on a twice-yearly basis. Use the evidence-based outcomes to develop and improve health practice models.

9 PUTTING IT TOGETHER

This section provides a checklist to use throughout the project evaluation process.

Developing the project evaluation
- Have you used the principles of participatory evaluation?
- Have you reviewed the evaluation framework and example worksheet?
- Have you identified the evaluation resources required to plan and carry out the evaluation?
- Have you discussed the roles and responsibilities of those involved in the evaluation?

Reviewing the project evaluation plan
- Are the project goals clear and realistic?
- Are the project objectives specific and measurable?
- Are the project goals consistent with the overall goals of the funding program?
- Is an evaluation framework prepared and included in the plan?
- Does the evaluation proposal demonstrate a process that will provide information to answer the five key evaluation questions?
- Does the evaluation proposal demonstrate a participatory process that includes others, e.g. target group members?
- Are the success indicators for the project identified in clear, measurable terms?
- Is there a practical outline of how evaluation information will be collected and from whom?
- Does the proposal give ideas on how the evaluation results will be used both throughout the project and at the end?

Assisting and monitoring project evaluation work
- Is the project sponsor regularly informed of evaluation findings?
- Have the roles and responsibilities for reporting purposes been negotiated for the project sponsor and the outside evaluator?



- Does the final evaluation report address all five evaluation questions?

Using project evaluation results
- Is there a plan in place for identifying different ways to share evaluation information?
- Are project evaluation results being used to contribute to future project planning?

Common Evaluation Terms and What They Mean

evaluation - a way of measuring if a project is doing what it says it will do.

goals - general statements of what an organization is trying to do.

objectives - specific, measurable statements of what an organization wants to accomplish by a given point in time.

objective approach - an approach which values the perspective, views and opinions of those outside of or distanced from the situation, event, organization, project, etc., as the primary basis for making an assessment or judgment.

informant - in research and evaluation terminology, the person you interview or question is called the "informant".

impact or outcome evaluation - gathers information related to the anticipated results, or changes in participants, to determine if these did indeed occur. It may also be used to test the effectiveness of a new program relative to the results of an existing form of service. An impact evaluation will tell you about the effects of a project.

process or formative evaluation - an ongoing dynamic process where information is added continuously (typically using a qualitative approach), organized systematically and analysed periodically during the evaluation period. A process evaluation will tell you how the project is operating.

quantitative approach - an approach that tries to determine cause and effect relationships in a program. A quantitative approach will use measurements, numbers and statistics to compare program results. The information that is found is considered "hard" data.

qualitative approach - an approach that examines the qualities of a program using a number of methods. This approach uses non-numerical information - words, thoughts and phrases from program participants, staff and people in the community - to try to understand the meaning of a program and its outcome. The information that is found is considered "soft" data.


