A critical review of Program and Project Evaluation Models

Roberto Linzalone*
Dipartimento di Matematica, Informatica ed Economia (DIMIE), Università degli Studi della Basilicata, Via dell'Ateneo Lucano 10, 85100 Potenza, Italia
Institute of Knowledge Asset Management (IKAM), Centro Studi & Ricerche, Via D. Schiavone 1, 75100 Matera, Italia

Giovanni Schiuma
Dipartimento di Matematica, Informatica ed Economia (DIMIE), Università degli Studi della Basilicata, Via dell'Ateneo Lucano 10, 85100 Potenza, Italia
Innovation Insights Hub, University of the Arts London, Central Saint Martins, Granary Building, 1 Granary Square, London N1C 4AA, United Kingdom
Structured Abstract

Purpose – The Project Management body of studies has been a catalyst of research, contributing to the development of effective theories and models that enable the management of a project's results. From another point of view, however, greater attention is needed, and still lacking, towards theories and models that support the achievement of stakeholders' value. Of fundamental importance in supporting the management of a program/project with the aim of value matching is the evaluation activity. A variety of evaluation approaches and models exist. The existing literature suggests that no one approach is best for all situations; rather, the best approach varies according to many different factors. This paper collects, classifies and compares several project/program evaluation models, providing insights on the selection of the proper evaluation model.

Design/methodology/approach – Through a literature review, several models addressing program or project evaluation have been selected. They have been analysed in terms of characteristics, approach (qualitative vs. quantitative), field of application, pros and cons. A summative critical analysis compares the models and suggests implementation insights, depending on the evaluation scope and/or the project's field.
Originality/value – The research contributes to the definition of a holistic framework of project/program evaluation models that overcomes existing limitations: an overall collection and comparison of evaluation models, and a critical analysis for an effective selection of the model.

Practical implications – Evaluation of the results and impacts of programs and projects is all the more fundamental in turbulent and highly competitive environments. The assessment and selection of the evaluation model is a crucial activity for many different purposes: financing a project, assessing a project's efficacy, improving a program's performance. Any project/program stakeholder should consider the characteristics of evaluation models as a rationale for an effective and efficient evaluation process.

Keywords – Program, Project, Planning and Evaluation, Evaluation models.

Paper type – Research Paper
1 Introduction

In recent decades the Project Management (PM) approach has spread widely across all fields of human activity. The scientific community pays growing attention to research and studies focusing on PM; companies and public administrations increasingly work by projects, applying PM models and techniques. The PM approach implies planning the achievement of a given objective on the basis of the project's activities. According to this approach, management by activities secures effective management of the whole project, with respect to expected time and cost. The PM approach, indeed, recognizes as drivers of customer value creation the respect of planned time, cost and quality of the delivered product/service. It does not include, among its management drivers, stakeholders' value, that is, all the benefits that arise in the mid and long term from the project's results. In today's economic, political and social context, measuring the value generated in the mid term (effects) and in the long term (impacts) by projects and programs cannot be circumvented if companies, public administrations, politics, the third sector, or a whole country are actually to be boosted. If, on the one hand, PM has been a catalyst for research, contributing largely to the development of effective theories and models for managing projects, on the other hand greater attention needs to be paid to theories and models enabling
the management of the value that spins off from the project or program. The management perspective changes from that of PM: the project/program is not the goal, but the means to impact stakeholders' value. A first step towards supporting the management of a project's or program's impacts, in the perspective of stakeholder value creation, is modelling the relations between the project/program and the changes it creates (or induces) on stakeholders in the mid and long term. This paper aims to identify and frame the models that play a role in the measurement and analysis of the effects/impacts produced by projects and programs; these are identified as evaluation models. The terms project and program are used interchangeably. The paper is organized as follows: Section 2 provides the background of the research, Section 3 reports the research results, and Section 4 outlines the conclusions.
2 Program and Project evaluation: definitions and concepts

The variety of projects and programs that managers face today is notably greater than in the past: local development programs, social innovation projects, poverty reduction programs, political programs, cooperation projects and research projects are just a few of today's project typologies. Global dimension, the intangible nature of objectives and the breadth of target beneficiaries are just some of the emerging characteristics of a program. They call for a re-examination of project management models according to new management drivers, such as mid- and long-term impacts and stakeholders' value. These critical drivers have been adopted by the European Community in its planning programs under the name of Project Cycle Management (PCM) (European Commission, 1993). This management approach focuses on the wider effects and impacts of a project or program, and becomes critical for many reasons: funding, management and communication. A different approach to planning distinguishes traditional PM from the management of impacts: the management of impacts is based on 'planning by objectives', while PM is based on 'planning by activities'. In planning by objectives, the ultimate objective of the project, that is the impact, is identified first; then the sub-objectives, that is the effects, which are necessary preconditions for generating impacts.
Project evaluation, which means precisely the measurement of the created value, makes it possible to evaluate the actual impact produced on a project's stakeholders, to assess ex ante the coherence of a project, to provide managers and policy makers with feedback on the quality of the management process, and to improve or innovate the management itself. Any framework designed to promote an understanding of evaluation should include a common conception of what evaluation is (Saskatchewan Minister's Advisory Committee on Evaluation and Monitoring, 1989). Evaluation concepts are understood in very different ways. For this reason it is important to define them so that everyone attaches the same meaning to them, or knows what the differences are, thereby improving communication. Most definitions refer to program evaluation; some refer to project or policy evaluation. Some definitions use one term to refer to all types of evaluation, for example including policy and program evaluation under the umbrella label of 'policy evaluation'. The concept of evaluation is wide and requires a focus on the context and field of study. In management and economics, evaluation is the assessment and analysis of the effectiveness and direction of an activity; it involves the formulation of judgements about impact and progress. "Evaluation is the comparison of the actual effects of a project against the agreed planned ones. Evaluation looks at what was planned, what has been achieved, and how it has been achieved" (Shapiro, 1996; Archibald, 2012). Many different definitions of evaluation can be found in the management literature. Evaluation is often defined as the activity that judges worth. For example, evaluation is "…the determination of merit, worth, or significance…" (Scriven, 2007), or "a course of action used to assess the value or worth of a program" (Farell et al., 2002).
Policy evaluation is a family of research methods used to systematically investigate the effectiveness of policies, programs, projects and other types of social intervention, with the aim of achieving improvement in the social, economic and everyday conditions of people's lives (Government Social Research Unit, 2007). This last definition highlights the need to measure the change produced by a project, rather than stopping at its output. The change should be measured in terms of improvement (value creation) or worsening (value destruction). The change produced takes different forms and dimensions, not always measurable and comparable.
Program and project evaluation models are approaches and techniques aimed at evaluating and/or measuring one or more dimensions of change, such as social, economic, cultural or technological change, determined by actions taken in one or more contexts. Most definitions of evaluation are linked to the concept of improvement. As Kahan and Goodstadt (2005) argue, evaluation is a set of research questions and methods properly articulated to review processes, activities and strategies, with the aim of improving them and achieving better results. The strategic importance of evaluation for project management stems from the need to carry it out along the whole project life cycle. Accordingly, evaluation can be effectively divided into ex-ante, interim and ex-post evaluation. The first is applied in order to compare, select and finance alternative projects. The second is applied during project development in order to improve the strategy or the processes. The third is applied to completed projects, in order to draw lessons, insights, judgements and awareness about the decisions taken and the projects themselves. The real purpose of an evaluation is not just to find out what happened, but to use the information to make the project better. Different project evaluation 'process models' can be found in the management literature. Darabi (2002) proposes a systemic model of five phases:
1. program analysis and evaluability assessment
2. evaluation design
3. evaluation methodology development
4. implementation and administration
5. communication of evaluation findings
The observation that many programs and projects for social and economic development have not produced tangible and durable changes for their target stakeholders has induced the main international organizations for development (United Nations, World Bank, European Union) to equip themselves with frameworks and tools able to secure higher project efficacy and an overall improvement of program management. Just as the motto 'you cannot manage what you cannot measure' aptly stresses the importance of performance measurement and management in companies, the motto 'you cannot change what you cannot evaluate' appears equally appropriate to stress the importance of evaluation in the management of projects and programs. Evaluation models play a fundamental role in performing evaluations.
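As a toy illustration of how an ex-ante evaluation can compare, rank and select alternative projects, the following sketch implements a simple weighted-scoring comparison. The criteria, weights and scores are entirely hypothetical, and the code is illustrative only, not a model drawn from the literature reviewed here.

```python
# Minimal sketch of an ex-ante weighted-scoring comparison of alternative
# projects. Criteria, weights and scores are hypothetical examples.

def weighted_score(scores, weights):
    """Weighted sum of criterion scores (weights assumed to sum to 1)."""
    return sum(scores[criterion] * w for criterion, w in weights.items())

# Hypothetical evaluation criteria and their relative importance.
weights = {"impact": 0.5, "feasibility": 0.3, "cost": 0.2}

# Hypothetical candidate projects, scored 1-5 on each criterion.
projects = {
    "A": {"impact": 4, "feasibility": 3, "cost": 5},
    "B": {"impact": 5, "feasibility": 2, "cost": 3},
}

# Rank the alternatives from best to worst by weighted score.
ranked = sorted(projects, key=lambda p: weighted_score(projects[p], weights),
                reverse=True)
print(ranked[0])  # the alternative with the highest weighted score
```

In a real evaluation the weights themselves would be elicited from stakeholders, which is exactly where structured techniques such as AHP or MCDA (collected in Section 3) come into play.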
3 Program and Project Evaluation Models: a literature review
3.1 A collection from the literature

Many evaluation approaches and models exist. The literature suggests that no one approach is best for all situations. Rather, the best approach varies according to factors such as fit with basic values, the intent of the evaluation, the nature of key stakeholders, and available resources. In addition, it is not necessary to stick strictly to one approach: evaluations "might quite reasonably combine elements of different approaches or adapt to local conditions" (Rogers and Fraser, 2003). A variety of approaches exists (Kahan, 2008): goal based, goal free, theory based (logic model), utilization, collaborative, balanced scorecard, appreciative inquiry, external, CIPP. The Logical Framework Approach (LFA) is probably the most widely used to support the planning, evaluation and management of a project (Practical Concepts Incorporated for the US Agency for International Development, USAID, 1969).
Typology | Framework/model | Source | Nature
Peer Review (p.r.) | A. Direct p.r.; B. Modified direct p.r.; C. Ancillary p.r.; D. Traditional p.r.; E. Indirect p.r.; F. Pre-emptive p.r. | Various authors from the scientific literature (most dated source: Royal Society of Edinburgh, 1731) | Qualitative
Case study | A. Prospective c.s.; B. Retrospective c.s. | Le Play F., 1829 | Quali-quantitative
Technological forecasting | Scenario Planning | Kahn H., 1950 | Qualitative
Technological forecasting | Cross-impact matrices (or Inter-dependency matrices) | Gordon T., Hayward H., 1968 | Quali-quantitative
Technological forecasting | Morphological analysis | Zwicky F., 1967 | Qualitative
Financial methods | Internal Rate of Return (IRR), Net Present Value (NPV), Pay Back Period (PBP) | Value-based management literature | Quantitative/Financial
Financial methods | A. Binomial Option Pricing Model; B. Trinomial Option Pricing Model | Cox, Ross and Rubinstein, 1979 | Quantitative/Financial
Economic-based methods | Cost-benefit/Cost-effectiveness Analysis (CBA) | Economic literature (most dated source: Dupuit J., 1844) | Quantitative
Economic-based methods | Social Accounting Matrix (SAM) | Stone and Brown, 1962 | Quantitative
Economic-based methods | A. Experimental economics; B. Data; C. Instrumental variables; D. Computational methods; E. Structural econometrics | Economic literature | Quantitative
Technological-based methods | A. Technology Assessment; B. Technology Dynamics; C. Technology Forecasting | Technology management literature | Ex-ante, quantitative
Narrative methods | Storytelling | Social sciences | Qualitative
Narrative methods | Impact narratives | Social literature | Qualitative
Ethnographic methods | Ethnographic Evaluation | Social sciences | Qualitative
Scoring methods | Analytic Hierarchy Process | Saaty T.L., 1970s | Quantitative
Scoring methods | Program Assessment Rating Tool (PART) | United States Office of Management and Budget, 2002 | Quantitative
Scorecard methods | Key Performance Indicators (KPI) | Management literature | Quantitative
Scorecard methods | Balanced Scorecard | Kaplan and Norton, 1992 | Quali-quantitative
Scorecard methods | Performance Prism | Neely et al., 2002 | Quali-quantitative
Bibliometric methods | Main Science and Technology Indicators (MSTI) | OECD (2013); World Bank (undated) | Quantitative
Pathways analysis | Participatory impact pathways analysis (PIPA) | Challenge Program on Water and Food (CPWF) | Qualitative
– | Earned Value Analysis/Management (EVA/M) | United States Department of Defense, 1960s | Quantitative
– | CPM/PERT (Critical Path Method / Program Evaluation and Review Technique) | Catalytic Construction Company, 1957 / Booz, Allen & Hamilton, Inc., 1958 | Quantitative
Logic model/framework | Logical Framework Approach (LFA) | Rosenberg L.J., 1969 | Qualitative
Logic model/framework | Kellogg's Logic Model | Quigley M., for W. K. Kellogg Foundation, 1998 | Quali-quantitative
Logic model/framework | CIPP Evaluation Framework | Stufflebeam D. et al., 1960s | Qualitative
Logic model/framework | Weaver's Triangle | Weaver P., undated | Qualitative
TQM Approach | Malcolm Baldrige Award/Model | Malcolm Baldrige National Quality Improvement Act | Quali-quantitative
TQM Approach | EFQM Excellence Model | European Foundation for Quality Management (EFQM) | Quali-quantitative
Strategic | SWOT Analysis | Humphrey A., 1967 | Qualitative
Strategic | Strategy Map | Kaplan R.S., Norton D.P., 1996 | Qualitative
Strategic | Critical Success Factor | D. Ronald Daniel (McKinsey & Company), 1961 / J.F. Rockart, 1979 | Qualitative
Breakdown/Tree Structures | Work Breakdown Structure | United States Department of Defense, 1957 | Qualitative
Breakdown/Tree Structures | Cost Breakdown Structure | Management literature | Quantitative
Breakdown/Tree Structures | Problem tree analysis | System analysis literature | Qualitative
Statistical | Six Sigma | Motorola, 1981 | Quali-quantitative
Multicriteria analysis | Multicriteria decision analysis (MCDA) | Early 1960s | Quantitative

Tab. 1 - Summary table of the Program/Project Evaluation models literature review.
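To make the financial methods collected in Tab. 1 concrete, the sketch below computes two of them, Net Present Value (NPV) and Pay Back Period (PBP), on hypothetical cash flows; it is an illustration of the standard formulas, not of any specific framework reviewed here.

```python
# Illustrative sketch of two financial evaluation measures (NPV and PBP).
# Cash-flow figures and discount rate are hypothetical.

def npv(rate, cash_flows):
    """Net Present Value: cash_flows[0] is the (usually negative) outlay
    at time 0; later entries are discounted at the given rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def payback_period(cash_flows):
    """First period in which cumulative (undiscounted) cash flow turns
    non-negative; None if the outlay is never recovered."""
    cumulative = 0.0
    for t, cf in enumerate(cash_flows):
        cumulative += cf
        if cumulative >= 0:
            return t
    return None

flows = [-1000, 400, 400, 400, 400]  # initial outlay, then yearly inflows
print(round(npv(0.10, flows), 2))    # about 267.95: positive, value-creating
print(payback_period(flows))         # 3: outlay recovered in year 3
```

NPV and PBP answer different evaluation questions (value created vs. liquidity risk), which is one reason Tab. 1 lists them side by side rather than as substitutes.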
Approaches vary on the basis of what is evaluated, who participates in the evaluation, the evaluation purpose, and how the evaluation is conducted. In practice, approaches are often combined. Despite the large number of evaluation models, their heterogeneity in nature, field of application and phase of application makes the selection of the suited model(s) a strategic decision for the evaluation. In order to decide which model/approach suits best, a literature collection of evaluation models has been carried out. The collection results in 41 frameworks and/or models; some of them share subjects, contexts/fields of application, or evaluation/measurement approaches. These commonalities allow them to be grouped into typologies. In addition, they have been labelled by 'source' and 'nature' (qualitative vs. quantitative).
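Similarly, the Earned Value Analysis/Management (EVA/M) model collected above reduces to a few standard variance and index formulas; the sketch below applies them to hypothetical figures, purely as an illustration of the technique's quantitative nature.

```python
# Illustrative sketch of Earned Value Analysis on hypothetical figures.

def earned_value_indices(pv, ev, ac):
    """Standard EVA measures from planned value (pv), earned value (ev)
    and actual cost (ac)."""
    return {
        "SV": ev - pv,   # schedule variance (negative = behind schedule)
        "CV": ev - ac,   # cost variance (negative = over budget)
        "SPI": ev / pv,  # schedule performance index (<1 = behind)
        "CPI": ev / ac,  # cost performance index (<1 = over budget)
    }

m = earned_value_indices(pv=100_000, ev=80_000, ac=90_000)
print(m["SPI"], m["CPI"])  # both below 1: behind schedule and over budget
```

Note that EVA evaluates execution against the plan (the traditional PM drivers of time and cost), not the mid- and long-term stakeholder value that the evaluation models discussed in Section 2 target.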
4 Conclusions

The collection of models/approaches for program and project evaluation represents the first step of a research project aimed at defining a wider, prescriptive framework to support model selection. A suitable characterization should take into account some of the program's variables, such as phase and field/context, and some of the evaluation's variables, such as purpose. The research also contributes to understanding how important the adoption of evaluation models is in the management of the impacts of programs and projects, in the perspective of stakeholders' value creation. Particularly important is the selection of the approaches and techniques supporting ex-ante, interim and ex-post program evaluation. Useful implications arise from the research as spin-offs for practitioners such
as managers, policy-makers and public decision makers, who have to deliver projects and programs in an age of ever higher expectations and needs coming from stakeholders.
References

Archibald, R. (2012), Project management. La gestione di progetti e programmi complessi, Franco Angeli, Milano.
Bussi, F. (2001), Progettare in partenariato, Franco Angeli, Milano.
Cantamessa, M. et al. (2007), Il project management. Un approccio sistemico alla gestione dei progetti, ISEDI.
Darabi, A. (2002), "Teaching Program Evaluation: Using a Systems Approach", American Journal of Evaluation, Vol. 23, pp. 219-228.
European Commission, DG VIII (1993), Project Cycle Management, An Integrated Approach, Brussels, February 1993.
Farell, K.M., Kratzmann, S., McWilliam, N., Robinson, Saunders, S., Ticknor, J. and White, K. (2002), Evaluation Made Very Easy, Accessible, and Logical, Atlantic Centre of Excellence for Women's Health, online at: www.acewh.dal.ca/eng/reports/EVAL.pdf
Gordon, T. and Hayward, H. (1968), "Initial Experiments with the Cross-Impact Matrix Method of Forecasting", Futures, Vol. 1, No. 2, pp. 100-116.
Kahan, B. (2008), Excerpts from Review of Evaluation Frameworks, Saskatchewan Ministry of Education, Canada.
Rossi, M. (2004), I progetti di sviluppo, Franco Angeli, Milano.
Scriven, M. (2007), Key Evaluation Checklist, Evaluation Checklists Project, University of Michigan, online at: www.wmich.edu/evalctr/checklists
Shapiro, J. (1996), Monitoring and Evaluation, Civicus, online at gobookie.org.