Overview of the delivery approach
Report produced for Bill & Melinda Gates Foundation
14th August 2019 - Draft Version 1.4
Robin Todd & Dan Waistell
To provide an assessment of the effectiveness of the ‘delivery approach’ in education in Africa and Asia through the development of several case studies which examine the strengths and limitations of various applications of the approach.
• In recent years there has been growing interest across Governments, Multilateral and Bilateral Development Agencies in looking beyond the formulation of best practice policies and focusing on implementation and ‘getting things done’.
• At the heart of this interest has been a set of ideas and structures which can be termed the ‘delivery approach’. These ideas and structures are intended to bring about a transformative shift in attitudes and behaviour towards public service delivery and achieve rapid results at scale by overcoming barriers to effective implementation, usually focused on government services such as health and education.
• The purpose of this assignment is to provide an assessment of the effectiveness of the ‘delivery approach’ in education, primarily in Africa and Asia, through the development of case studies which examine the strengths and limitations of various applications of the approach. This assessment took the form of a desk review of existing literature, supplemented by face-to-face and phone interviews and emailed requests for information to individuals involved in the specific cases, including government and district officials as well as headteachers and teachers, facilitated through our in-country teams.
The first three sections of this report look at the ‘delivery approach’ as a whole, drawing lessons from existing research and practical experience. Section 4 lays out the specifics of delivery in particular locations through a series of case studies.
• Section 1 - Definition of the delivery approach, key principles and relationship to system strengthening
• Section 2 - Origins, evolution and variants of the delivery approach
• Section 3 - Impact of the delivery approach and assessment of its strengths, weaknesses and potential application in various contexts
• Section 4 - Country Case Studies
− Tanzania
− Sierra Leone
− Punjab, Pakistan
− Haryana, India
− Ethiopia
− Selected shorter case studies (PMDU UK, PEMANDU Malaysia, Indonesia, Colombia, Ghana)
Attempts to implement policies which improve education service delivery are often beset by challenges. Despite important contextual differences between countries there are a number of common public service delivery challenges which include:
• Lack of clarity as to the practical steps needed to turn national policy commitments into tangible outcomes at an institutional level i.e. within schools or colleges.
• Lack of joined-up working at national level - policy priorities falling across or between various Councils, Boards or Agencies, with unclear accountability for results.
• Challenges at national level in ensuring quality of delivery when responsibility is devolved to local level. If results are poor in a local area, it is still often the national government which gets the blame.
• Focus in government on process and procedures rather than outcomes. This can lead to a limited sense of urgency to make a positive difference within schools.
• A lack of sufficient human and financial resources throughout the education system and a general sense of acceptance that these constraints mean that policy goals may never be achieved.
• Lack of local level understanding of national commitments means that intended results are never realised.
• Lack of understanding at the centre of government and among other stakeholders as to what is needed at an institutional level (school, college, etc.) to deliver high-quality services as well as lack of awareness of the day-to-day constraints faced by front-line professionals in delivering these services.
“I realised that the problems may not necessarily lie in the quality of policy making processes or policies themselves, but on the mechanisms in place for implementation, monitoring and evaluation. I noticed that much of the time we are bogged down by processes and bureaucratic inertia.”
President Jakaya Kikwete of Tanzania, opening remarks at the 12th Forum of Commonwealth Heads of African Public Service, 13th July 2015, Dar es Salaam
• In recent years there has been growing interest across Governments, Multilateral and Bilateral Development Agencies in looking beyond the formulation of best practice policies and focusing on implementation and ‘getting things done’. At the heart of this interest has been a set of ideas which can be termed the ‘delivery approach’.1
• The World Bank under the leadership of President Jim Yong Kim has played a role in advancing thinking on the delivery approach or what it terms the ‘science of delivery’. Dan Hymowitz from the Africa Governance Initiative (AGI) think-tank rightly points out that achieving results through this approach is as much of an ‘art’ as it is a ‘science’, requiring a shrewd understanding of politics and incentives.2
• The delivery approach involves the application of a set of best practice principles initially popularised in the early 2000s by the UK Government’s Prime Minister’s Delivery Unit (PMDU).
• These principles (set out overleaf) have evolved over the past two decades as they have been applied across numerous country and sectoral contexts with varying degrees of success. This evolution means that it has become quite difficult to define precisely what is meant by the ‘delivery approach’, as its application covers strategies and techniques ranging from the very prescriptive, centralised and all-encompassing to the more flexible and localised application of specific tools or techniques.
• In summary the delivery approach consists of a set of tools and techniques which can assist in ‘getting things done’ but the application of these tools and the incentive structures and accountability mechanisms which surround them are absolutely critical. What works in one country, district or region will not necessarily be successful if rigidly applied elsewhere.
Through our involvement in, and observation of, delivery approach processes in various countries we have identified the following key principles which are associated with the successful application of the approach, with consistent and active senior system leadership required across all principles.
Focus on a limited number of key priorities which are clearly understood across the delivery system. Ensure that there is a strong link between priorities and resources so adequate budgets are available.
Develop a clear understanding of tangible outcomes so that key priorities are viewed from the perspective of what is achieved at the level of the individual citizen e.g. in schools rather than what government spends or does at a Ministerial level.
Use regular data as the basis for establishing effective performance management routines. Develop good quality data and metrics to measure what matters. Collect reliable data for a small number of priorities and then ensure that data is analysed and used regularly to inform decision-making.
Engage stakeholders in analysing issues & owning outcomes. Involve front-line workers in analysing problems and developing solutions. Develop understanding of delivery systems to identify the drivers of successful outcomes and the motivations and perceptions of system actors. Develop a support and challenge function at national and local levels.
Develop a communications strategy to assist in rapidly engendering change and reform to reverse a perceived decline or deficit in standards of service delivery.
Ensure accountability for performance throughout the delivery system.
Strike the right balance between planning and delivery, recognising which areas can achieve rapid results and which may take a longer time.
To send a message throughout the delivery system that priorities are important, that progress and issues will be monitored and acted on regularly, and that people at all levels of the delivery system will be held accountable for progress and results.
• None of the principles set out on the previous page are revelatory, complex or exceptional. These are common sense things which every Government should be looking to do in one form or another.
• In many countries however these principles are not being applied effectively or consistently. Thus, when looking to apply the delivery approach, countries should start from an assessment of their existing strengths and weaknesses. Building on existing strengths rather than focusing predominantly on weaknesses is an important part of the approach. It is also critical that any priorities, processes and structures are genuinely country-owned rather than imposed from outside.
• The aim of the delivery approach should be to strengthen existing systems so that they are better able to deliver citizen-centred outcomes, rather than bypassing systems and establishing new delivery mechanisms - this may drive short-term results but runs the risk of making systems weaker over the medium and longer term.
The delivery approach is not something completely new, and actors in any education system are likely to have been applying some of the key principles in aspects of their work. What is different about the delivery approach is how the four principles come together in a coordinated, catalytic manner to address a specific problem or issue, focusing ‘like a laser’ until performance has improved. The ‘flow diagram’ of processes required to enable this to happen is set out opposite.
We have turned the delivery approach principles set out on Slide 4 into a set of hypotheses and statements on critical success factors. These are set out below and will be used as the framework to examine country case studies.
Critical Success Factor 1: Senior system leadership & commitment
The delivery approach will not work unless there is a genuine desire from system leaders to achieve results and a willingness to devote significant time to ensuring that accountability flows through the delivery system. System leaders will usually be senior politicians but they can also be prominent public servants and will have genuine authority which is respected at local level.
Critical Success Factor 2: Prioritisation
The initial prioritisation of issues is not easy for system leaders and politicians who often have to deal with many priorities. Genuine prioritisation means accepting trade-offs, focusing on success in one area and, by implication, de-prioritising other potentially important areas.
Critical Success Factor 3: Data and routines
Regular and reliable performance data are used to monitor progress against milestones and targets. Data informs action plans which are then adapted, targeting support where needed. Feedback mechanisms are in place to ensure that local level actors are using data effectively.
Critical Success Factor 4: Understanding and analysis
Priorities and targets are understood across the delivery system from the national through to local levels. System actors are clear on the role they must play to achieve targets. Plans and targets are based on a solid understanding of issues and what is required to achieve change.
Critical Success Factor 5: Accountability
Accountability structures personally chaired by the system leader meet regularly to examine performance data and progress against plans. These structures have a mandate to discuss issues and take action to resolve blockages and constraints to achieve results.
Literature on the delivery approach has often equated the adoption of the approach with the establishment of Centre of Government Delivery Units (at Presidential, Prime Ministerial or Ministerial level). Recent reports have charted the rise and fall of Delivery Units worldwide, but it is important not to see the establishment of a Delivery Unit as synonymous with the application of the delivery approach. Countries can still apply the principles of the delivery approach without necessarily establishing a Delivery Unit, and there may be instances where it is advantageous not to establish a separate Unit. They can also establish a Delivery Unit without necessarily following the principles of the delivery approach.3, 4 It may therefore be inaccurate to point to the abolition of a Delivery Unit as evidence that the delivery approach has ‘failed’ in a particular country context.
Do I need to establish a Delivery Unit in order to apply the delivery approach? A series of questions for governments
• Does the Government have a clear set of delivery priorities?
• Is the Centre of Government using data to regularly analyse performance and take action to improve results?
• Do officials at the centre of Government understand what is required to deliver positive results in schools? Do they have sufficient understanding of what motivates teachers?
• Does the centre of Government have an existing and effective mechanism for monitoring and performance managing results?
If the answer to each of these questions is YES then a Delivery Unit may not be necessary.
Even if you answer ‘No’ to the questions above, a Delivery Unit may not be the most suitable approach, depending on the country context. While Delivery Units can be useful in kick-starting an underperforming system and ensuring the effective application of the delivery approach, they can also have several drawbacks including:
• Cost: new structures cost money and it is often easier to create new things rather than fix problems in existing systems.
• Recruitment Issues: getting recruitment right is important and not easy. Sometimes too many senior staff are employed or staff are employed who are ‘stuck’ in the existing ways of thinking. Recruiting outsiders is not always the answer either as they may not understand how government functions or they may be resented by the civil service, particularly if they are on higher salaries.
• Establishment of Parallel Structures: actually weakening delivery systems by establishing separate data collection or analysis systems which remove responsibility from mainstream system actors, or by being seen to be ‘responsible for delivery’ rather than for system strengthening.
• Conflict with the Delivery System: actors in the delivery system may refuse to cooperate with the Delivery Unit and seek to sabotage it by refusing to share information or by sharing deliberately inaccurate data.
The delivery approach was initially applied in education systems which had a relatively high level of capacity. There were specific delivery issues which required addressing in the UK, Malaysia etc. but in these systems there were i.) skilled people; ii.) data systems which could be used to monitor and track progress and iii.) the existence of functional delivery systems with relatively well defined roles and responsibilities from national through to school level.
In some of the country contexts in which the delivery approach has been applied these basic enablers do not exist (Sierra Leone, for example, where, in 2015, District Education Offices had only a handful of staff and no meaningful way of monitoring the hundreds of schools under their jurisdiction).
In these instances the answers to the four questions on the previous slide about establishing a Delivery Unit would suggest that a Unit is the answer. But care should be taken in coming to this conclusion. A Delivery Unit in a low capacity system may end up substituting for the system, particularly if dedicated structures are created through to local level to deal with capacity constraints (this occurred in Sierra Leone with the creation of District Delivery Teams who reported to the President’s Delivery Team).
Such an approach may have merit in enabling the effective project management and delivery of basic results in priority areas. But if the focus is predominantly on delivering the ‘outputs’ in the diagram above, without considering how to establish and sustain the ‘enablers’ in the existing delivery system, there is a danger that, once the dedicated support around the delivery approach is withdrawn, the system may actually be weaker than before the intervention started.
This suggests that careful thought is needed when considering how to apply the Delivery Approach in systems where these ‘enablers’ do not exist. It is important to consider how the delivery approach can achieve short-term results whilst also strengthening the existing system by developing these enablers.
Whilst the main motivation for applying the delivery approach is, in many instances, to achieve short-term results and ‘get things done’, there is an implicit assumption that, by achieving such results, delivery systems will also be strengthened. The system strengthening diagram introduced on the previous slide illustrates the main components required to achieve improved educational outcomes. It consists of a set of four outputs required to improve learning outcomes, supported by the existence of five enablers. In the diagram below we have presented just the ‘outputs’ section of this diagram to illustrate how the four outputs of the system strengthening model are very closely aligned with the four delivery approach principles.
At an output level the delivery approach provides all that is needed within this model to achieve improved learning outcomes: it provides support and capability to analyse and unblock problems; introduces effective management routines linked to timebound implementation plans with clear metrics; and ensures that these plans are focused on an agreed set of genuine priorities. If we look only at the outputs of the system strengthening model, the delivery approach appears to be a very relevant set of tools and techniques to enable countries to improve educational outcomes. However, we must remember that achievement of the four system strengthening outputs is reliant on the existence of five enablers: good data; strong relationships and a culture of collaboration; skilled people; clear roles and responsibilities; and clear accountabilities. The relationship between the delivery approach and these five enablers is discussed on the next slide.
Achievement of the outputs in the system strengthening model is reliant on the existence of five enablers. As the diagram below illustrates, there is a degree of overlap and correlation between these enablers and three of the four delivery approach principles. This illustrates that effective application of the delivery approach is reliant on existing conditions within the delivery system. If, for example, good data does not exist then it will need to be produced. In systems where these enablers are not already in place there is a temptation, in order to achieve short-term results, to establish parallel or temporary data collection, roles & responsibilities and accountability systems. These may help to achieve short-term results but will not strengthen systems. Ultimately, therefore, the delivery approach by itself is insufficient to bring about sustained system strengthening, although it can play a contributory role.
It is also noteworthy that one of the key enablers - skilled people - has not generally been a priority when applying the delivery approach. There has usually been a focus on building skills and capacity at a national level amongst a small core team of staff in the Delivery Unit or similar structure, but this has rarely extended down to district or school level. Instead the focus has been on achieving quick results, measuring progress and holding people to account for their performance. Likewise, in some applications of the delivery approach, there has been a much stronger focus on accountability and responsibilities than on collaboration and relationship building. Collaboration and cross-departmental working played an important role in the UK’s PMDU, Malaysia’s PEMANDU and Sierra Leone’s PRP, although there is less evidence of this from some of the other case studies in this report.
The 5 ‘enablers’ from the system strengthening diagram discussed in the previous slides appear to be a necessary pre-condition for sustainable system change.
In our country case studies we will examine the extent to which the delivery approach succeeded in bringing about sustained improvements in data collection, analysis and use; established clear roles and responsibilities for delivery; and strengthened and clarified accountability for achieving results. All three of these ‘enablers’ are aligned with the principles of the delivery approach (as the diagram above illustrates) and should be an explicit part of the package of activities and support introduced alongside the approach.
We will also examine the extent to which:
i.) strong relationships and a culture of collaboration (which has been a key principle in the application of some variants of the delivery approach e.g. PMDU 2007-2010 with its focus on cross-departmental Public Service Agreement (PSA) boards) and
ii.) skilled people existed as a possible pre-condition for success (or failure) within each country’s education delivery system.
The delivery approach relies on the premise that a small team of people working in the centre of the education delivery system can introduce prioritized performance management routines backed up by good data which will in turn improve learning outcomes for potentially millions of children. In order for this premise to hold there must be i.) a sufficiently strong and collaborative system to enable instructions and incentives from the centre to reach sub-national and school levels and ii.) a sufficiently skilled workforce to respond effectively to these instructions and incentives so that outcomes and results improve. As mentioned previously, if such enablers are not in place there is a temptation to establish ‘quick fix’ parallel data and accountability systems or roles and responsibilities in order to achieve improvements against a set of activities and outputs. Such an approach is not likely to contribute to genuine system strengthening over the longer-term.
This diagram displays a number of countries which are currently attempting to apply the delivery approach to improve service delivery with a specific focus on education.4
UK, Implementation Unit, 2012
Albania, Delivery Unit, 2013
Romania, Delivery Unit, 2014
Serbia, Delivery Unit, 2015
Jordan, Prime Minister’s Delivery Unit, 2015
Oman, Tanfeedh Delivery Unit, 2016
Saudi Arabia, Central Delivery Unit, 2016
Costa Rica, Delivery Unit, 2015
Guatemala, Delivery Unit, 2016
Peru, Delivery Unit, 2016
Canada, Results & Delivery Unit, 2016
Maryland (USA), Governor’s Office of Performance Improvement
St Lucia, Delivery Labs, 2018
Western Cape, SA, Delivery Support Unit, 2014
South Africa, Operation Phakisa, 2014
Uganda, PMDU, 2016
Ethiopia, Education Delivery Unit, 2017
Guinea, PMDU, 2017
Egypt, Ministry of Education, 2017
Ghana, Education Reform Secretariat, 2018
Punjab, Pakistan, Special Monitoring Unit, 2014
Haryana, India, Department of School Education, 2014
PENGGERAK, Brunei, 2014
New South Wales, Australia, Premier’s Implementation Unit, 2016
A number of Delivery Units and related structures have closed in the past decade, often due to changes in political administration. In some countries similar Units have then been established again by new administrations.
UK, PMDU, 2001-2010
Netherlands, 2006-2010
Wales, 2011-2016
Jordan, Prime Minister’s Delivery Unit, 2010-2013
Indonesia, UKP4, 2009-2015
Malaysia, PEMANDU 2009-2018
Mongolia, 2013-2015
Queensland, Australia, 2004-2007
Australia (Federal) 2003-2015
Chile, 2010-2014
Colombia, Delivery Unit, 2014-2018
Tanzania, President’s Delivery Bureau, 2013-2017
Sierra Leone, President’s Delivery Team, 2015-2017
The widespread establishment of Delivery Units by Governments can be traced back to the formation of the UK Prime Minister’s Delivery Unit (PMDU) by Tony Blair in 2001 under the leadership of Sir Michael Barber. The PMDU was tasked with ensuring that the Prime Minister’s domestic policy priorities were implemented effectively so that they achieved tangible performance improvements and significant results on the ground. The PMDU focused on a relatively small number of key outcomes which were a real priority for the Prime Minister and his Government. Located right at the centre of Government (initially in the Cabinet Office and then the Treasury), with direct access to the Prime Minister, the PMDU was kept relatively small, with fewer than 50 staff, and attracted a blend of top talent from the civil service, private sector and frontline service delivery positions in local government.
PMDU was considered to be a successful innovation (particularly after the publication of Sir Michael Barber’s ‘Instruction to Deliver’ in 2008) and it attracted considerable attention from governments worldwide who came to the UK to learn how it operated and see how a similar approach could be applied in their own contexts.5
It should be noted that the PMDU’s remit and mode of operation, whilst remaining fundamentally the same, did alter and flex over the course of its lifespan (2001-2010). This evolution is important as it has influenced the subsequent application and evolution of the Delivery Approach globally over the past two decades. Some of this evolution is helpfully set out in the Institute for Government’s ‘Public Service Agreements (PSAs) and the Prime Minister’s Delivery Unit’ report.6
PMDU 2001-2005 (Blair) - Head: Sir Michael Barber
Core Functions & Characteristics
- Regular contact with Prime Minister
- Focus on small number of key priorities in education, health, transport & home affairs.
- Strong emphasis on centralised quantitative target setting and performance management against trajectories through routines and publication of departmental ‘league tables’ against 17 priority PSAs.
- Conducting frontline deep-dive ‘priority reviews’ to understand delivery issues.
PMDU 2005-2007 (Blair) - Head: Sir Ian Watmore
Core Functions & Characteristics
- Less direct contact with PM due to domestic political issues.
- Broader focus and introduction of Capability Reviews intended to assess Departmental performance and build delivery capacity across all Government Departments.
PMDU 2007-2010 (Brown) - Head: Ray Shostak CBE
Core Functions & Characteristics
- Relocation from Cabinet Office to Treasury.
- Overseeing 30 PSAs which emphasised cross-government working and collaboration.
- Unblocking delivery obstacles through deep-dive ‘priority reviews’, problem solving and follow-up brokerage work with departments (this consumed most of PMDU’s time).
- Broadening of scope and shift of emphasis onto collaborative problem solving and away from centralised target setting.
The PMDU’s 2001 to 2005 structure and mode of operations became the template which many governments looked to follow when developing their Delivery Units. This process was accelerated by Sir Michael Barber’s prominent role in promoting Delivery Units, initially with McKinsey and then through Delivery Associates. As governments looked to adapt the original PMDU principles (which Barber termed ‘Deliverology’) variants of the approach inevitably emerged.

One distinctive Delivery Unit (DU) variant was developed in Malaysia as the Performance Management and Delivery Unit (PEMANDU) in 2009. PEMANDU’s approach (initially supported by technical advice from McKinsey) was based heavily on private sector operating practices introduced by PEMANDU’s CEO, Idris Jala, a former head of Malaysia Airlines. Idris Jala introduced a methodology called Big Fast Results - 8 Steps of Transformation. A key feature of this approach was the operation of large-scale ‘Delivery Labs’, lasting several weeks, which enabled prominent stakeholders to develop detailed implementation plans (‘3 Feet Plans’) to achieve Key Performance Indicators (KPIs). PEMANDU’s Delivery Labs and the resulting Plans were publicized through media engagement and Open Days.

PEMANDU itself was a large organization, employing over 130 people from both the public and private sectors, and it operated like a private sector company, publishing an Annual Report setting out progress against the Government Transformation Programme (GTP) and Economic Transformation Programme (ETP). PEMANDU ceased operating as a government entity in 2018 and became a private consulting firm, ‘PEMANDU Associates’, which now works with countries such as St Lucia and Nigeria.
PEMANDU has played an important role in publicizing its particular Delivery Unit approach internationally, most notably through the experience of Tanzania, where President Kikwete adopted the PEMANDU model wholesale in 2013, terming it ‘Big Results Now!’ (BRN). The Tanzanian model was headed by the President’s Delivery Bureau (PDB) and the former head of the PDB, Omari Issa, has in turn played an important role in promoting the ‘delivery labs’ model through the Education Commission, influencing practice in countries such as Ethiopia and Uganda.
Another highly influential application of the Delivery Approach was that adopted in Punjab, Pakistan spearheaded by Sir Michael Barber (initially as part of McKinsey and subsequently on an independent basis). The Punjab Education Roadmap drew heavily on Deliverology and focused intensively on monitoring a small number of priorities using the considerable authority of the Chief Minister to ensure progress, delivering impressive results.
Elsewhere Management Consultancy companies such as Dalberg (in Guinea) and McKinsey (in Sierra Leone, drawing upon their experience in Punjab) have promoted variants of the approach along management consultancy lines. Some countries have also been influenced by the more nuanced approach taken by the UK’s PMDU between 2007 and 2010 in focusing on collaborative system strengthening rather than target setting. In Ministries of Education in e.g. Uganda and Haryana, India these ideas have coalesced with concepts such as Problem Driven Iterative Adaption (PDIA) capacity building and problem-solving to strengthen systems from local to national, focusing on the application of principles rather than establishment of structures.
Over time these various influences and experiences have coalesced as new countries have drawn upon technical advice from various experts and organisations. For example, the Ethiopian Delivery Unit was established with technical advice from Delivery Associates (drawing on Deliverology and the UK PMDU 2001-2005 experience), but the Ethiopian Government was also insistent that this approach should incorporate Labs (drawing upon Tanzania’s experience in 2013, which was in turn influenced and facilitated by PEMANDU). It is therefore becoming increasingly difficult to trace a single line of influence from one application of the delivery approach to another.
As previously explained, many of the examples in the table below have been influenced by a combination of the ‘origin’ models (with the UK PMDU 2001-2005 experience being particularly influential), so the table simply serves to illustrate some of the main connections between different country approaches. Some countries, e.g. Nigeria, Ghana and Uganda, have already experienced different applications of various models by sector, donor and administration.
UK PMDU 2001-2005 & Delivery Associates (Deliverology)
• Personal engagement of senior system leader.
• Small number of key priorities.
• Focus on centralised target setting & regular performance monitoring routines.
• Focus on local problem solving.
• Limited external publicity.
• Tools and techniques codified as ‘Deliverology’.
Examples: Punjab Roadmap (Education); Western Cape (SA); Egypt (Education); Ethiopia; New South Wales (Australia).

PEMANDU, Malaysia 2009-2018 & PEMANDU Associates 2018 to present
• Large structure with over 100 staff.
• Led by private sector CEO.
• ‘Delivery Labs’: large, intensive, prolonged problem-solving sessions with key stakeholders.
• Strong focus on publicity and communications, including the publication of Annual Reports and holding of Open Days.
Examples: BRN! in Tanzania; South Africa (Operation Phakisa); Oman; Andhra Pradesh (India); St Lucia.

Management Consultancy Variants (McKinsey, Deloitte, Dalberg, AGI)
• Influenced by McKinsey’s involvement in Punjab and PEMANDU establishment.
• Packaged the steps of ‘delivery’ in an accessible management consultancy format.
• Relies on pace of implementation and rigour of analysis.
• Dependent on international technical assistance, often short-term.
Examples: Sierra Leone; Rwanda; Nigeria; Guinea.

UK PMDU 2007-2010
• Initial focus on performance framework e.g. PSAs.
• Focus on deep-dive problem solving reviews.
Examples: Uganda (Ministry of Education); Ghana (Education Reform Secretariat).

Global evolution (BCG) and incorporation of PDIA
• Consideration of principles ahead of structures - use of existing systems.
• Local experimentation.
• Managing for results and building capacity within delivery systems from districts and schools upwards.
Examples: Haryana (India); Colombia.
The diagram below sets out the various stages of the delivery approach with examples of approximate costings and timings depending on the model used.
Step 1
Initial Preparation and Prioritisation
Preparation is the most critical phase of all, as many issues need to be thought through in detail prior to implementation of Step 2.
Countries must carefully consider their current situation and whether they need to establish new structures to implement the delivery approach. There must be extensive stakeholder engagement and assessment of institutional readiness.
Prioritisation may be relatively simple if the Minister or senior officials have clear priorities from a manifesto or within an Education Sector Plan. At the other end of the cost spectrum, the Delivery Lab (set out separately here as Step 2) can be used for prioritisation.
The main cost driver is the extent to which external consultants are required to conduct preparatory analysis.
Step 2
Diagnostic Fieldwork or Delivery Labs
Once priorities have been identified, diagnostic fieldwork is required to understand the issues and engage front-line workers in problem solving. This can be done at relatively low cost through targeted stakeholder engagement and fieldwork, or at much greater cost if a large-scale, public Delivery Lab is held.
A 4-week Deep Dive exercise in Ghana, involving stakeholder workshops and Ministry staff conducting fieldwork in selected Districts and schools, cost $50,000 including all staff inputs.
The cost of holding a 6-week Delivery Lab using the PEMANDU model in Tanzania was estimated by DFID at $350,000, excluding staff and consultants’ time. The total cost of a 6-week Delivery Lab estimated in the Education Commission’s pioneer country work was $2,000,000.
Step 3
Establishment of structures and communications
The establishment of structures can be done concurrently with the preparation and prioritisation phase so that people are recruited and trained. Delays in establishing structures and finalising staffing can result in a critical loss of momentum following Step 2.
The cost of this step can vary considerably depending upon i.) the size of the structure, ii.) the staffing of the structure (private sector and external recruits will cost more than civil servants already on the payroll), iii.) the extent to which the structure is dependent upon international expertise, and iv.) the extent and reach of the communications activities publicising the structure and the Delivery Approach plans.
The cost of establishment can increase significantly if Units are created at regional and/or district level. Experience suggests that such Units may undermine the delivery system and create parallel structures.
Step 4
Operation of data and performance routines
Data gathering and performance management routines are critical to the effective functioning of the delivery approach. Costs can vary considerably: they can be minimal if a decision is made to use existing information gathering systems (e.g. district-level circuit supervisors or ward education coordinators), or quite substantial if new methods of data collection are introduced (either using tablets or other electronic means, or by recruiting dedicated external monitors to gather school-level data).
Performance management routines involve holding regular meetings chaired by the senior system leader and then providing feedback to relevant actors throughout the system. If there is commitment from system leaders and others then this does not need to be an expensive element of the delivery approach.
Step 5
Implementation of priority initiatives
If initiatives are already funded through existing government budgets then this stage will have no cost (other than the ongoing operational costs of structures and data routines listed in Steps 3 & 4).
If dedicated funding is required to implement the Delivery Plans developed during Steps 1 and 2 then the cost could be very significant indeed. Payment By Results funding support and incentivisation programmes such as those operated by DFID, World Bank and SIDA can also run into the $100s of millions.
Should implementation of the delivery approach be time-bound?
There are advantages to seeing it as a short-term (3 to 5 year) catalyst to system reform, although improving learning outcomes can take much longer. Even the most notable Units have had a finite lifespan, e.g. the UK PMDU (2001-2010) and PEMANDU (2009-2018).
Step 1: TIMING 3 to 6 months; COST minimal to $millions.
Step 2: TIMING 4 to 8 weeks; COST $50,000 to $2,000,000.
Step 3: TIMING 1 to 4 months; COST $200,000 to $5,000,000 annually.
Step 4: TIMING 2 to 6 months; COST minimal to $5,000,000 annually.
Step 5: TIMING ongoing for 3 to 5 years; COST $0 to $100s of millions.
Some applications of the delivery approach can be ‘linear’, with little considered analysis or problem solving after plans are developed (as with the initial application of PEMANDU’s ‘3 Feet Plan’ approach in Tanzania). To truly apply the principles of the delivery approach, however, it is important to take an iterative perspective and be prepared to adapt plans and approaches based on ongoing analysis and problem solving from within the system. Ideally what is required is a ‘strong’ centre of government which clearly articulates priorities and targets across the system, with local-level actors (within Districts or schools) then having sufficient autonomy to think creatively about how best to achieve these within their own context. This may well require capacity building, training and support below sub-national level, something which has not been considered in some applications of the delivery approach.
[Diagram: the iterative delivery cycle across the five steps. Initial diagnostics, priorities and plans lead to real-time management results and analysis, which in turn leads to problem analysis, solutions and adaptation of plans. Regular reviews connect system levels and enable decision makers to unblock larger problems.]
When PMDU was established in the UK in 2001 it was designed to be relatively low cost and add value by unblocking delivery obstacles within existing government programmes. It had an operating budget which included the salaries of approximately 40 staff (a mixture of civil servants and secondees from management consultancies and local government) and a limited budget for administration and fieldwork.
Subsequent variants of the delivery approach have varied significantly in cost, with the PEMANDU model (requiring over 100 staff and the conduct of Delivery Labs, which can cost over £1 million each) being the most expensive. Much of the cost of establishing the delivery approach in many countries has come from the management and technical assistance fees charged by international ‘experts’ and consulting companies. DFID committed £39 million to the establishment and operation of the Tanzanian Big Results Now! Programme, with the bulk of this going to PEMANDU’s fees, the staffing costs of the large Presidential Delivery Bureau (PDB) and Ministerial Delivery Unit (MDU) structures, and the conduct of Delivery Labs.
One of the key principles in the UK PMDU was that money would be ‘off the table’ when it came to finding implementation solutions, i.e. the system would need to find ways of achieving results with the same amount of money. This was based on the belief that it is always possible for systems to improve and that, if money was ‘on the table’, it would remove the impetus to improve the system as more funds would simply substitute for system shortfalls and inefficiencies. Departments and Agencies at all levels of the delivery system therefore worked within the constraint that if they wanted more resources for a specific initiative then they had to reallocate them from elsewhere in their operational budget. All delivery plans were therefore fully costed and funded from Departments’ own budgets.
In a developing country context funding constraints are often more acute. This has meant that plans and priorities drawn up through Delivery Labs and other prioritisation exercises in countries such as Tanzania and Sierra Leone have had significant funding shortfalls. Rather than working within the existing resource envelope the application of the delivery approach in these contexts has become more ‘projectized’ as development partners have had to step in with dedicated funds to fill these funding gaps. This has a significant disadvantage in that the delivery system then views the delivery approach as a discrete donor project (often perceived as driven and led by donors not government) rather than as a mechanism for system strengthening and increasing accountability. Two ways to potentially overcome this issue are:
i.) to make use of existing donor funded programmes and coordinate and direct them to help achieve results required under the delivery approach.
ii.) to establish a Payment By Results (PBR) mechanism with agreed Disbursement Linked Indicators (DLIs), with funds then disbursed to government once it has itself achieved agreed milestones towards the expected results required under the delivery approach. This approach can be effective, particularly if the PBR arrangements encompass local government units which receive resources when they achieve results.
We looked in detail at 5 country case studies, conducting interviews and analyzing data to assess the impact of the delivery approach and the extent to which the Critical Success Factors from Slide 8 were followed.
Tanzania
Assessing the effectiveness of the Big Results Now! (BRN) education initiative and subsequent Programme for Results (P4R) from 2013 to 2019.
Sierra Leone
Assessing the effectiveness of the President’s Recovery Priorities (PRP) in Education from 2015 to 2017.
Punjab, Pakistan
Assessing the effectiveness of the Punjab Education Roadmap from 2011 to 2019.
Haryana, India
Assessing the effectiveness of the Quality Improvement Programme and Saksham Haryana from 2014 to 2019.
Ethiopia
Assessing the effectiveness of the Education Delivery Unit from 2017 to 2019.
We also provided briefer descriptions of some other notable applications of the delivery approach: the UK’s Prime Minister’s Delivery Unit (PMDU), the original model which many governments sought to emulate; Malaysia’s PEMANDU, an influential and distinct approach; Colombia’s Presidential Delivery Unit, an example of the application of the delivery approach from South America; Indonesia’s UKP4, an example of a Presidential-level Unit which became overloaded with multiple priorities; and Ghana’s Education Reform Secretariat, a very recent example of a nuanced application of the Delivery Approach.
Section 3 summarises the main findings from this multi-country analysis.
Section 4 provides more detailed findings from each of the country case studies.
The relationship between enablers and the systemic impact of delivery approaches
This illustrative graph shows that the quality of enablers (from Slide 13) can be a barrier or catalyst to the effectiveness of the delivery approach in bringing meaningful system change.
• Some of the most high profile examples of delivery come from systems where the enablers were already strong. This allowed the delivery approach to act like a laser beam to resolve key issues in focal areas, building on a sound foundation.
The impact of delivery approaches in systems with high quality enablers has generally been focused on specific areas of the education system (‘laser beams’ to resolve particular problems) rather than bringing about holistic system change as many parts of the system are already functioning effectively.
• In systems where these enablers are not in place, delivery approaches have been less successful in delivering meaningful system change as they have to build short-term fixes to boost the enablers or bypass them entirely with parallel systems.
• While the delivery approaches that have been built in systems with weak enablers have still had areas of short term impact in achieving results (e.g. Sierra Leone), they have been less successful at developing lasting systemic change.
The two-by-two matrix below classifies educational outputs and outcomes by a combination of their technical complexity and political difficulty.7 From the evidence considered in the country case studies we believe that the delivery approach can be most effective in achieving results in the two left-hand quadrants of this model, where the linkages between activities and outputs are clearly understood by actors within the delivery system, i.e. those which have ‘low’ technical complexity and lend themselves to a target-based performance monitoring regime.
Addressing these ‘left-hand’ issues is important as, if left unresolved, such issues often form a barrier to the achievement of the more technically complex issues in the right hand quadrants. The delivery approach can be particularly useful in tackling issues in the top left-hand quadrant where support and commitment from a system leader can resolve problems which have been traditionally hard to unlock.
The case studies and the literature in this field suggest that success is much harder to achieve in areas where the logical link between activities, outputs and outcomes is less clear (and harder to measure; there is a danger with the delivery approach that the focus shifts to the easily measurable rather than the genuinely important). As an example, international evidence suggests that teachers are often not clear as to the logical linkage between specific behaviours and improved learning outcomes.20 Putting great stress on achieving outcomes can thus lead to demotivation if the person or institution being incentivised or held to account doesn’t feel it is within their power to improve. An example of this would be a head teacher who is held to account for poor exam results when he or she doesn’t have the authority to hire, effectively discipline or dismiss their teaching staff. In such a situation effective incentives would be those which are linked to specific behaviours which teachers, head teachers and officials have the capability to achieve.
The case studies show that the delivery approach can have considerable success in areas such as improving pupil and teacher attendance and increasing capitation grant flows but that it appears to have been less effective in addressing complex system issues such as reforming teacher performance and career development and improving learning outcomes where i.) strong enablers are required to achieve success and ii.) where the technical linkage between activities and outputs is more context specific and less clear.
Outputs and Outcomes classified by Technical Complexity and Political Difficulty
The matrix plots Political Difficulty (easy to hard) against Technical Complexity (simple to complex):
• Politically hard, technically simple: teacher pay & conditions; teacher absenteeism.
• Politically hard, technically complex: improved vocational training system; Initial Teacher Training (ITT).
• Politically easy, technically simple: In-service Teacher Training (INSET); regular release of school capitation grants; pupil attendance.
• Politically easy, technically complex: teacher performance & career development; improved learning outcomes.
Evidence from the country case studies suggests that the delivery approach may be effective in addressing technically simple issues which can be easily measured where there is a clear link between activities & outputs/outcomes.
Despite some stories of improvements and increases there is limited evidence from the country case studies that the delivery approach has led to a widespread, genuine and sustained improvement in learning outcomes.
• In Tanzania there was a significant improvement in primary and secondary examination pass rates between 2012 and 2015 (the period in which BRN! was operational). However, a number of factors complicate this initial picture of significant improvement, including i.) the change in examination methods, which made it difficult to legitimately compare results year on year, and ii.) the reduction in the number of pupils sitting examinations. RTI’s national Early Grade Reading Assessment (EGRA) results do show that there was a significant and verifiable reduction in the proportion of ‘non-readers’ in Kiswahili in Grade 3, from 27.7% in 2013 to 16.1% in 2016.8 These improvements were not sustained, however, and the latest EGRA information suggests that 2018 rates have returned to levels comparable with 2013, probably due to the significant increase in primary enrolment and class size occasioned by the introduction of fee-free education in early 2016.
• In Sierra Leone the President’s Recovery Plan (PRP) did not have a specific target for improving learning outcomes or increasing examination pass rates although this was the general aspiration of the plan. Instead the PRP was focused on putting in place the basic ‘building blocks’ of the education system (producing standard lesson plans, verifying the teacher payroll, constructing classrooms etc.). There were some improvements in examination pass rates in 2016 and 2017 but these rates fell in 2018.
• In Punjab learning outcomes were not an initial focus for the Education Roadmap but they became a focus from 2014 onwards. Six-monthly learning assessments carried out by the Roadmap team suggest significant improvements in Grade 3 literacy and numeracy from 2016 onwards. These findings have been widely questioned, however, including by former members of the Roadmap team, who noted that the reductive focus on 15 learning outcomes, which were tested monthly, meant that teachers taught to these tests and not the curriculum. 9,10,11
• In Haryana there have been improvements in literacy and numeracy since the introduction of the Quality Improvement Programme in 2014. India’s annual ASER report compares literacy and numeracy in Standard V in government schools across all States. The latest (2018) ASER report shows that Haryana has seen improvements in literacy and numeracy between 2014 and 2018, but that these improvements have not been exceptional. On Standard V reading, Haryana was ranked 4th out of 18 states in 2014 and had fallen to 5th in 2018, ranking 11th of the 18 states in terms of percentile improvement over this period (2014-2018). On Standard V division, Haryana was ranked 5th in 2014 and 3rd in 2018, ranking 12th in terms of percentile improvement (2014-2018). ASER data therefore suggests that Haryana has not outperformed other States which did not adopt the delivery approach.12
• In Ethiopia it is too early to measure the impact of the delivery approach on learning outcomes.
Looking beyond these developing country case studies, there is evidence from the UK that the delivery approach methods used in the London Challenge did lead to a genuine and sustained improvement in results in London primary and secondary schools. In 2003 these schools were below the national average but by 2010 they were ahead of the national average, a situation which still persists. It is important to note that this is a sub-national change and that efforts to replicate the methodology in other areas of England were not as successful. In Malaysia, PEMANDU and the Ministry of Education claim that Literacy and Numeracy Screening (LINUS) has led to significant improvements in learning outcomes since 2009. A 2018 World Bank report, however, notes that there is insufficient evidence to state whether LINUS has actually improved the reading and numeracy skills of early graders, due to the lack of key data. 13
Delivery approaches can bring about rapid change by focusing on performance improvements of measurable issues. While this can be a force for good, applying pressure to achieve rapid, measurable change can also bring about some negative unintended consequences where the cause-effect relationship between activities and targets is more complex.
BRN!, Tanzania – Pupils being excluded from examinations
• Recent research funded by the Research on Improving Systems of Education (RISE) Programme came to two contrasting conclusions about BRN! and learning as it relates to the School Ranking initiative (where primary schools were publicly ranked according to PSLE pass rate):
• BRN School Ranking improved learning outcomes for schools in the bottom two deciles of their districts.
• The School Ranking also led some of the poorest performing schools to strategically exclude students from the terminal year of primary school.
Punjab, Pakistan – Increases in exam manipulation and a narrowing of teaching to focus on what was being measured rather than what was in the curriculum
• Interviews indicated that when the performance management focus was on exam results in 2013/14, cheating and the faking of results increased.
• When the focus switched to monthly competency-based assessments, where only 15 or so numeracy and literacy competencies were measured, teachers learnt to game the system by teaching to those tests and ignoring the majority of the curriculum. Pupil performance increased based on the test data, but it is more debatable whether learning had systemically and sustainably improved.
Punjab Education Roadmap
• Significant increase in student enrolment.
• Reduction in teacher absenteeism.
• Increased supply of textbooks and educational materials.
• Increase in the number of schools with basic facilities.
Haryana, Quality Improvement Programme/Saksham Haryana
• Increase in number of teachers undergoing structured in-service training.
• Improvement in distribution of teaching and learning materials.
• More regular school inspections and improvements in operation and scope of MIS.
Tanzania, BRN! & EP4R
• Increase in regularity and timeliness of capitation grants reaching schools.
• Increase in the number of schools holding remedial classes and extra lessons.
• Improvements in teacher deployment and allocation.
• Improvement in availability of publicly available examination data.
Sierra Leone, President’s Recovery Priorities
• Distribution of structured lesson plans to all primary and junior high schools.
• Construction of classrooms and WASH facilities.
• Reduction in number of unapproved schools.
• Expansion of school feeding programme.
These examples illustrate that, regardless of other claims made about it, the delivery approach can achieve timely and tangible results in easily measured (and sometimes previously neglected) areas.
Evidence from the country case studies shows that the delivery approach did have a significant impact in achieving results across a variety of areas related to educational inputs and outputs.
There is some evidence from the country case studies that the delivery approach has led to a greater focus on achieving results.
Punjab Education Roadmap
• The Punjab Education Roadmap’s regular performance ‘heatmaps’ and stocktake meetings with the Chief Minister had a significant impact on officials throughout the delivery system. At the start of the Roadmap process the Chief Minister would summarily fire prominent officials from those districts which were performing poorly on the heatmaps.
• This led to a culture of fear throughout the districts and a relentless focus on achieving the specific performance metrics linked to the Roadmap. Whilst this led to increased accountability it also enhanced incentives for ‘gaming’ targets.
Haryana, Quality Improvement Programme/Saksham Haryana
• QIP and Saksham Haryana have undoubtedly ensured that there is a far stronger focus on learning outcomes than had previously been the case in the State. There is also a much greater emphasis on monitoring individual pupils’ learning levels on a regular basis and using this data to measure progress.
• It is not clear, however, that this increased focus has led to fundamental changes in teacher accountability.
Tanzania, BRN! & EP4R
• BRN! did not fundamentally change accountability mechanisms, performance appraisal or promotions across the education delivery system, but it did have an impact through a ‘top-down’ bureaucratic focus on results and targets. This encouraged officials throughout the system to focus on measures to improve exam results, although it had the perverse effect of leading to some children being excluded from taking exams in poorer performing schools.
• EP4R uses financial incentives to improve performance at district level, but it is uncertain whether this is sustainable.
Sierra Leone, President’s Recovery Priorities
• The PRP led to some cultural change within the national Ministry of Education, Science & Technology (MEST), as Directors held weekly meetings with the Deputy Minister every Monday morning to report on progress against their designated activities (with the Minister meeting the President every Thursday). This did lead to staff in the national Ministry taking a greater interest in results, but it had very little impact on accountability at sub-national level, particularly as many of these activities involved NGOs and other third parties as implementing agencies. Meeting routines ceased once the PRP ended.
The examples above show that, whilst the delivery approach can have some impact on encouraging a greater focus on accountability for results, there is a danger that it could lead to the introduction of a parallel accountability system which narrows the metrics used to judge success to such an extent that it encourages gaming and perverse incentives. Setting a pass rate target as a percentage (rather than an absolute number), for example, led to some schools in Tanzania excluding the poorest performing pupils from examinations.
Punjab Education Roadmap
• The cost of the significant number of external monitors (initially retired Army officials) was covered by the World Bank, and these monitors were already in place before the Roadmap commenced in 2011. Initially the monitors used a paper-based reporting system but this was upgraded to an electronic system using tablets.
• Despite the change in government in 2018 these monitors remain employed, and they are now mostly younger graduates of LUMS. Their cost is still covered externally (by the World Bank) rather than being absorbed by government.
Haryana, Quality Improvement Programme/Saksham Haryana
• There was a focus on regular data collection and assessment, including inspection visits to all schools every two months and the introduction of a dashboard based on competency-linked assessments carried out by teachers on their students six times a year. As these assessments are conducted by teachers rather than external assessors, the cost of gathering such data is lower than it would otherwise have been. Haryana has relied on external funding support from the Michael and Susan Dell Foundation to establish these data collection and analysis systems.
Tanzania, BRN! & EP4R
• BRN! did not establish a new data collection system but made use of existing data sources and introduced the requirement that districts and regions complete a simple one-page reporting template electronically each month. This was low cost and did not rely on external funding, but the quality of reporting was variable.
• EP4R incentivised government to improve EMIS and the collection and analysis of sub-national data, and there is some evidence that this has led to improved data use at local level.
Sierra Leone, President’s Recovery Priorities
• The PRP made use of the UNICEF RapidPro SMS reporting system to gather data on a monthly basis from all districts and schools. This system was first used as part of the early warning network to combat the spread of Ebola Virus Disease (EVD). Its operation relied on a group of external ‘volunteer’ monitors who visited schools and entered data on key metrics, which then went straight to the national Ministry and UNICEF. Districts themselves found it difficult to access this data as it was centralised, and the cost of the system was borne by DFID. At the end of the PRP the approach was discontinued, as the Government had relied on external funds to cover the costs of the monitors.
Those variants of the delivery approach which introduced new data collection systems reliant on third parties (the monitors in Punjab and the situation room volunteers in Sierra Leone) came at a cost which was borne externally by donors and not integrated within the government system. The systems in Tanzania and Haryana are administered by existing staff (local officials and teachers), so the cost is lower and there is a greater likelihood that they will be sustained beyond the existence of external funding.
Globally, application of the delivery approach has not tended to survive changes in political administration. In a number of cases this is because the approach, or the delivery units which personified it, was seen as a high-profile representation of what the incumbent President or Prime Minister believed in.
• In Tanzania, BRN! and the accompanying institutional architecture of the President’s Delivery Bureau and Ministerial Delivery Units were abolished by President Magufuli when he came to power in late 2015 (it took many months for this to be communicated officially, but in practice the PDB ceased having influence as soon as the new President came to power). Elements of the delivery approach have continued through EP4R and through a group of officials within the education system who continue to approach issues with a delivery mindset.
• In Sierra Leone the President’s Recovery Priorities (PRP) were always intended to be a short-term initiative which ended in 2017. There was then a change of government in 2018 which led to some major changes in the education system, including the division of the Ministry of Education, Science and Technology (MEST) into two separate Ministries representing basic and higher education. As of 2019 there is very little evidence remaining of the impact of the PRP, to the extent that DFID invited Sir Michael Barber to visit Sierra Leone and talk to the new Minister of Education about how ‘deliverology’ might be used to achieve results within the sector.
• In Punjab the architecture of the Roadmap remains in place despite a change of government in mid-2018. In part this is because the PMIU and data collection mechanisms are both elements of a programme of assistance and technical support provided by the World Bank which has bridged the two administrations. Elements of the approach are likely to survive the change in administration, although they may be adapted and altered in ways which are not yet clear over the coming years.
• In Haryana there was a change in ruling party at State level in October 2014 (with the Quality Improvement Programme having been initiated earlier in 2014), but this did not impact the implementation of the programme, which was led by the Department of School Education. The next elections are due in October 2019.
• In Ethiopia the change in Education Minister early in the process was a major setback to the effective application of the delivery approach.
• In Malaysia PEMANDU was abolished when a new government came to power in May 2018. The incoming government had been critical of PEMANDU whilst in opposition, describing it as having failed on its main KPIs and as being unaccountable to Parliament and the public.
• In the UK the PMDU was abolished in 2010 by a new government which saw it as representing the previous administration’s approach to public service delivery. A similar unit (under a different name, the Implementation Unit) was then re-established in 2012.
Within a functional democracy, transitions of power and new approaches to service delivery are an inevitability. Rather than being a permanent fixture, delivery units and the delivery approach may act as an important catalyst for longer-term system change by harnessing energy to focus on under-served issues and address problems where the solutions are not technically complex. Setting a temporary (e.g. three to six year) timeframe for the application of such processes may be one way of harnessing their benefits without creating long-term parallel systems which do not actually strengthen education systems. We should not view the abolition of a delivery unit as direct evidence of failure.
Some applications of the delivery approach had a specific focus on promoting private education, whereas others were more concerned with government schools.
Punjab Roadmap
• The Roadmap encouraged the promotion of private education through the Punjab Education Foundation. Significant resources were committed to PEF and it achieved encouraging results. The government viewed private education as an important means of increasing enrolment and attainment within Punjab and the Roadmap played a role in making this happen. The new government, elected in mid-2018, cut PEF’s funding as they saw the approach as being closely aligned with the previous Chief Minister.
Haryana
• Private education plays an important role in Haryana but it was not an explicit focus of the initial Quality Improvement Programme. Much of the ASER data used to measure success only uses results from government schools.
• There has been a slight reduction in the proportion of pupils enrolled in private primary schools over the course of the programme, but this figure still stood at 55% as of 2018.
Tanzania
• BRN! did not have an explicit focus on private schooling. Private education has not historically been encouraged in Tanzania, owing to the prevailing public-goods ideology first established by President Nyerere at independence. Private schools were included within the BRN! targets and, after some discussion, it was agreed that some private schools would be recognised through the school incentive scheme (although excluded from monetary rewards).
Sierra Leone
• The Ministry of Education, Science and Technology (MEST) was, at the time of PRP, focused almost entirely on supporting the government system rather than considering private education. Private schools (prevalent mostly in urban areas in Sierra Leone) were seen as problematic as many government teachers would also be employed as private school teachers whilst still collecting their salaries. Private schools were therefore not included in PRP activities.
The delivery approach is a set of tools and techniques which can be used to promote private education if the incumbent administration wishes to do so. The example of Punjab shows that the private sector can respond to the system of targets, incentives and hard and soft levers which are typically deployed through the delivery approach. Countries looking to use the delivery approach to promote or improve private education should consider which of these tools and techniques are more likely to be effective in their context: top-down bureaucratic pressure, for example, is liable to be less effective when deployed with private sector actors than when applied to sub-national governments and officials.
Improvements in learning outcomes rely on the quality of interaction between teacher and student. The diagram below from Cambridge Education sets out the multi-faceted and inter-related nature of teacher performance and the range of factors which need to be addressed in order to shift systemic capacity to improve learning outcomes. The boxes in red assess the extent to which applications of the delivery approach have addressed each of these factors.
Generally a high priority
This has been the major focus of most applications of the delivery approach: using regular performance data to hold officials to account for results, and clarifying responsibility for specific activities through roadmaps and delivery plans. Accountability can be either low stakes (naming and shaming, rankings, etc., as in Tanzania and Sierra Leone) or high stakes (leading to dismissal or disciplinary procedures, as in Punjab).
Generally a low priority
There have been efforts to promote merit-based recruitment and promotion in Punjab, but generally applications of the delivery approach have paid less attention to teacher pay, promotion and conditions. In Tanzania efforts were made to reduce salary arrears, but there are few examples of systemic changes in this area.
Generally a medium priority
Most applications of the delivery approach have seen a strong focus on in-service teacher training, recognising that there are capability and capacity gaps which must be addressed if learning outcomes are to improve. Such initiatives have not fundamentally changed the structure of teacher training, nor addressed initial teacher training (PRESET), where the longer-term timeframe for achieving results means it is generally overlooked in delivery plans.
Generally a low priority
These factors are critically important but are less tangible to measure and do not lend themselves to easily quantifiable outputs. As a result they have barely featured in applications of the delivery approach. BRN! did have an activity called ‘teacher motivation’, but this was restricted to measuring payment of salary arrears.
Whilst it has clearly proved effective in achieving short-term results across important areas of the education system in a number of countries, the evidence of the effectiveness of the delivery approach in bringing about sustained improvements in learning outcomes is far from conclusive.
This may be because, in relation to the ‘teacher performance’ diagram on the previous slide, the delivery approach has generally focused on shifting metrics concerning accountability and responsibility, combined with the roll-out of large-scale in-service teacher training programmes (and corresponding materials production and distribution). There has generally been less focus on longer-term changes to initial teacher training systems, teacher promotion, recruitment, pay and conditions, and other measures to raise the status of the teaching profession and lift morale and motivation. These issues are generally quite complicated and take a number of years to show tangible results, so they are often overlooked when countries develop initiatives and metrics where progress must be demonstrated in a matter of months.
The delivery approach can thus play an important role in helping to bring about system reform and strengthening by focusing on a small number of priorities and driving improvements, shining a ‘laser beam’ onto system inadequacies which have remained hidden for decades and addressing issues at school and district level which have constrained national progress. The delivery approach can help to catalyse change but it cannot, by itself, bring about widespread systemic reform as such reform involves multiple areas (not a small number of measurable priority programmes) and longer-term changes.
In countries where the key ‘enablers’ are already in place and system capacity is relatively high, the delivery approach has had some success in addressing more complex issues as well as simple measures such as improving delivery of inputs (which are still important in their own right). In such higher-capacity contexts the approach has been useful in addressing specific issues and bringing about improvements, but this is a very different matter from attempting to use the delivery approach to comprehensively reform a failing system. In summary, for countries with capacity constraints, the delivery approach has merit in enabling governments to focus on priority areas in an Education Sector Plan and to deliver against key priorities which lend themselves to simple and regular metrics. We would not recommend using the delivery approach to try to implement an Education Sector Plan in its entirety: the education system may become exhausted by the data and performance management requirements, whilst prioritisation will not be possible. A rigorous performance management routine can also be most effective when it is time-limited, meaning that delivery units do not necessarily need to become permanent fixtures.
When looking at the initial set of public service delivery challenges mentioned at the start of this paper, one was “focus in government on process and procedures rather than outcomes.” Studying various applications of the Delivery Approach, it is clear that processes and procedures are necessary pre-conditions for achieving outcomes. They are important for establishing effective performance management routines and for ensuring that data is analysed and used regularly to inform decision-making. Without this focus on routines there is a danger that the work of a Delivery Unit is conducted on an ad hoc basis, or that its analysis does not link in to the systems which it is ultimately trying to improve.
The challenge then for government is not necessarily to remove processes and procedures but to ensure that these processes and procedures are focused on achieving outcomes rather than being seen as an end in their own right.
Whilst the evidence in this report suggests that the Delivery Approach has been more effective in achieving results for technically simple issues where progress can be measured regularly and unambiguously, it is worth stressing that this does not necessarily mean it cannot be used to improve learning outcomes. We would recommend that countries examine the evidence which exists within their specific context as to which technically simple and measurable activities and outputs may contribute towards improved learning outcomes. The Delivery Approach can then be used to oversee a programme of activities which, according to the evidence, should lead to improved learning outcomes if implemented effectively. The advantage of this approach is that the targets and incentives used to hold institutions and individuals to account will be clear and measurable. If these measurable results are achieved and the logical linkages between outputs and outcomes hold, then learning outcomes should improve.
The evidence presented in this report is inconclusive on the effectiveness of the Delivery Approach in improving learning outcomes. Some applications of the Delivery Approach did not have an explicit focus on learning outcomes, whilst others were implemented for too short a period to have an impact on them. It should be noted that, in cases such as Haryana, there was an improvement in learning outcomes over time; the scale of this improvement, however, was not significantly greater than that experienced in other States.
• The delivery approach can be implemented effectively without establishing a separate Delivery Unit; embedding the approach within existing structures and systems is preferable where sufficient capacity exists.
• The delivery approach is not a guaranteed solution for system strengthening but it can be an effective way to kick-start performance improvements and deliver measurable change in clearly defined areas of the education system.
• The delivery approach is well suited to technically simple issues where progress can be measured regularly and unambiguously. The delivery approach tends to be less effective on complex, systemic change issues.
• There is some evidence of the delivery approach supporting short-term improvements in learning outcomes, but the evidence is not conclusive: improvements may not be sustained and there is a risk of unintended consequences.
• Effective delivery systems require strong enablers, not just processes. Delivery approaches may not necessarily strengthen these enablers, as their focus on short-term results often requires parallel systems to initiate rapid change where the enablers are weak.
• Delivery approaches do not survive political transitions well, but they may not need to: the first three to six years could be where the biggest value lies, rather than seeing them as a permanent presence. They can be used to kick-start performance improvements which are then sustained by the system as a whole.
The next section of this report provides details, analysis and evidence from 5 country case studies (Tanzania, Sierra Leone, Punjab, Haryana and Ethiopia). Each of these case studies is structured as follows:
• A brief summary of the country context and approach and an assessment of how closely it adhered to the five success factors identified on slide 8.
• A headline assessment of the impact on learning outcomes in each country.
• Analysis of the motivation and context for adopting the delivery approach.
• Critical Success Factor 1: System Leadership and commitment
• Critical Success Factor 2: Prioritisation
• Critical Success Factor 3: Data and routines
• Critical Success Factor 4: Understanding and analysis
• Critical Success Factor 5: Accountability
• Assessment of the extent to which critical system ‘enablers’ were in place prior to implementation.
• Summary of distinct features of approach taken and conclusion on their relative effectiveness.
Following the five country case studies we then provide shorter summaries of noteworthy applications of the delivery approach in the UK, Malaysia, Colombia, Indonesia and Ghana.
• Big Results Now! (BRN!) began in 2013 under President Kikwete as a way to fast-track system improvements before the end of his term in office in 2015. It involved the creation of a Presidential Delivery Bureau (PDB) and Ministerial Delivery Units (MDUs), following the PEMANDU model.
• BRN! had very strong communications and developed data and routines which emphasised the importance of learning outcomes in schools – particularly in primary schools.
• BRN! was supported by a donor-funded results-based financing programme called EP4R, which commenced in 2014. BRN! and related structures were officially abolished in 2016 when President Magufuli took office, but EP4R has continued until 2019 with a gradual change in process, priorities and support. This meant that, within the Ministry of Education, there was continuity and a focus on adapting the delivery approach to achieve the new administration’s priorities and EP4R results.
• EP4R has focused on understanding delivery, problem solving and cultural change through financial incentives rather than BRN!’s focus on monthly reporting, data systems and ‘naming and shaming’.
• BRN! relied on attention from the President and Minister to drive action, whereas EP4R uses results-based financing. Both approaches have relied on external technical assistance to provide new ideas.
• While learning outcomes (expressed as percentage exam pass rates and achievement against EGRA and EGMA) improved during the initial BRN! period, subsequent performance has been affected by the 2016 introduction of fee-free education, which caused a huge enrolment spike that overwhelmed the system.
• Big Results Now! (BRN!) introduced in early 2013 to deliver transformational change by 2016 in 6 sectors including education.
• Overseen by a Presidential Delivery Bureau (PDB) headed by Omari Issa, former Chief Executive Officer of the Investment Climate Facility for Africa and Executive Director of Celtel. DFID committed £39 million of funding for technical advice from PEMANDU to establish and operate the PDB and Ministerial Delivery Units (MDUs), run Delivery Labs and fund other operating costs.
• Based on the PEMANDU approach of Delivery Labs and 3 Feet Plans. Heavily reliant on embedded experts from PEMANDU working in the PDB. Ministerial Delivery Unit (MDU) established in the Ministry of Education with one external recruit and four existing Ministry staff. Cambridge Education also provided embedded technical assistance following a request by the Minister of Education, Dr Shukuru Kawambwa.
• Broad set of 9 activities (set out on slide 49) in education aimed at improving primary and secondary pass rates and attainment levels in the early grades. These activities were developed in a 6 week Delivery Lab involving multiple stakeholders.
• Some activities were poorly conceived and had little direct impact on learning outcomes (School Improvement Toolkit) or were not effective in addressing systemic weaknesses (teacher claims) whereas others gained much more traction and were more impactful (STEP extra classes, focus on early grade reading, writing and arithmetic, school ranking). Between 2013 and 2016 there was measurable improvement in primary and secondary examination passes (PSLE & CSEE) and Early Grade Reading Assessment (EGRA) scores.
• One of the great strengths of BRN! was the extent of its communications: members of the public knew about it, and even the most remote rural schools realised that their performance was being scrutinised (they joked that the programme should be called Better Resign Now). The publication of school and district rankings (based on PSLE and CSEE exam performance) meant that schools and districts had an incentive to improve, as they did not want to feature near the bottom of the rankings. Annual prizes were awarded by the President to the most improved districts and schools. The focus on improvement rather than overall best performance gave remote schools hope that they could be recognised.
• Led to improvements in information flow between districts, regions and national government. A direct-to-school capitation system was introduced, although this required the 2015 change of President and his personal and political intervention to launch the ‘Fee Free Education’ policy.
• Development of a group of officials within the Ministry of Education with a new mindset and approach: no longer business as usual.
• 3 Feet Plans proved too prescriptive for Tanzania’s decentralised education delivery system. Failure to hold Presidential BRN meetings eventually undermined the linkage between presidential authority and reporting structures. The Ministry of Finance was unable to meet funding commitments – a key delivery constraint.
• Linked to a BRN Education Programme for Results (EP4R) financing instrument which, co-funded by DFID, the World Bank and the Swedish Government, made over $250 million available to the Government of Tanzania’s education sector from 2014, initially for BRN! and then for other identified priorities.
System Leadership & Commitment
Strong initially but weakened as PDB meetings not held
• Initial Presidential support for BRN!
• Incentives have driven system leadership on EP4R priorities
• Vulnerable to change of President or removal of incentives.

Prioritisation and resourcing
Strong prioritisation of objectives but many activities had weak resourcing (until EP4R)
• Clear priorities developed for BRN! on exam results and early grade learning outcomes.
• EP4R resources made available through achievement of Disbursement Linked Indicators (DLIs) meant that priorities were well understood.

Data, information and routines
Strong routines, weak system data
• Strong monthly routines in BRN!
• EP4R focused on annual performance evaluations linked to payments
• BRN! data routines were seen as a compliance action by some, and there were weaknesses in data collection and quality assurance at local level.

Understanding of delivery and problem solving
Weak understanding beyond TA teams
• BRN! activities developed to achieve the prioritised outcomes were quite varied, and implementation of some activities lagged due to lack of resourcing.
• Better understanding of EP4R priorities but less visible focus on overall learning improvements
• Senior involvement at the beginning in BRN!
• BRN! problem solving was very centralised with limited involvement of decentralised actors.
• EP4R problem solving was more gradual and involved front-line actors to a greater extent than in BRN!
• EP4R still relied heavily on external TA for problem solving.

Communication, Accountability and culture change
Strong and clear communications; no shift in accountability measures, but results focus had short-term impact
• Communication about the need to improve results was central to BRN! success – very clear messaging. It had an impact on behaviour as districts and schools were publicly ranked; no one wanted to be bottom.
• Communication in EP4R about the consequences of results – money – was also clear
• Short lifetime and external units limited BRN!’s longer-term impact on accountability.
• Accountability mechanisms for teacher promotion, pay etc. were unchanged under BRN!
• EP4R behaviour change was driven by incentives – would the culture change persist if incentives were removed? Unlikely.
BRN! had outcome-level targets for the period between 2012 and 2016 for: i.) improvements in Primary School Leaving Examination (PSLE) pass rates, where the rate increased from 31% in 2012 to 68% in 2016; ii.) improvements in Certificate of Secondary Education Examination (CSEE) pass rates, where the rate increased from 43% in 2012 to 69% in 2016; and iii.) improvement in EGRA, where the proportion of Standard 3 non-readers declined from 27.7% in 2013 to 16.1% in 2016. More recent evidence shows that exam pass rates have improved at a slower rate since 2016, whilst EGRA/EGMA results have fallen, likely due to the significant enrolment surges that followed the 2016 introduction of fee-free education.
• There can be no doubt that exam results have improved since BRN! although they had reached an all-time low in 2012. It is also difficult to compare relative performance year on year because of changes in assessment methods. The EGRA and EGMA results provide a more methodologically sound measure of learning outcomes and the positive change between 2013 and 2016 was very encouraging.
• The largest improvements in EGRA and PSLE results were in the poorest performing regions who also received support from the DFID-funded EQUIP programme during the period between 2013 and 2019.
• One concern is the potential impact of the Fee Free Education policy introduced in 2016 and the enormous increase in primary class sizes. Recent EGRA and EGMA results paint a less positive picture, and it seems the system has struggled to sustain improvements under this strain (average primary class sizes jumped from 60 to 90).
“EGRA and EGMA is controversial. The recent results have dropped back to 2013 baseline.”
EP4R TA Team Leader

Whilst we can have a fair degree of confidence that the improvement in Early Grade Reading Assessment (EGRA) results between 2013 and 2016 (as set out on the previous slide) was statistically robust, can we have the same level of confidence in the Primary School Leaving Examination (PSLE) and Certificate of Secondary Education Examination (CSEE) results?
Looking at the PSLE results it is striking that there is a very significant drop in pass rates between 2011 and 2012, from 59% to 31%. It was this drastic fall in results which helped to precipitate BRN! in the first place. 2012 saw a major change in the way in which PSLE examinations were set and marked, with the introduction of multiple choice tests assessed using Optical Mark Recognition (OMR) equipment. This change may have precipitated the significant fall in pass rates as i.) it potentially reduced the scope for cheating, as papers were no longer marked manually, and ii.) the use of pencils and forms to input answers was unfamiliar and may have confused some candidates. Setting aside the 2012 results and looking solely at 2013, 2014 and 2015, we do see a steady increase in pass rates: 50%, 57% and 58%. This improvement appears genuine, but there are issues with the comparability of results year on year, and the World Bank did not consider the results sufficiently robust for disbursement-linked financing when designing EP4R.
Analysing the number of candidates sitting PSLE annually between 2012 and 2015, there was a steady drop year on year, from 865,534 candidates in 2012 to 763,602 in 2015: almost 100,000 fewer candidates in just three years. Consequently, whilst the absolute number of candidates passing PSLE increased each year, the increase was not as spectacular as the percentage pass rate figures would indicate. Still, it is important to note that the 518,034 candidates who passed PSLE in 2015 was almost double the 265,873 who passed in 2012.
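The distinction above between headline pass rates and absolute numbers of passes can be sanity-checked with a quick sketch (a minimal illustration using only the candidate and pass figures quoted in this report; the computed rate is implied from those figures, not an official statistic):

```python
# PSLE candidate and pass figures as quoted in this report.
candidates_2012 = 865_534
passes_2012 = 265_873
candidates_2015 = 763_602
passes_2015 = 518_034

# Implied 2012 pass rate (consistent with the ~31% cited).
rate_2012 = passes_2012 / candidates_2012

# Despite ~100,000 fewer candidates, absolute passes almost doubled.
fewer_candidates = candidates_2012 - candidates_2015
growth_in_passes = passes_2015 / passes_2012

print(round(rate_2012 * 100, 1))   # → 30.7
print(fewer_candidates)            # → 101932
print(round(growth_in_passes, 2))  # → 1.95
```

This makes the point in the text concrete: a rising pass rate on a shrinking candidate pool overstates the gain, so both the rate and the absolute count need to be reported.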
A recently published (February 2019) RISE paper on the impact of BRN! school rankings on PSLE results found that these rankings, despite being classed as low-stakes accountability reforms (as there were no financial consequences for low performance), did have an impact on the performance of schools in the bottom two deciles of their district.21 This was despite the rankings exerting only ‘top-down’ bureaucratic pressure on schools, through the Ministry of Education, PMO-RALG, DEOs and headteachers, as the rankings were not widely publicised to parents and community members. The paper found that PSLE scores in schools in the bottom two deciles increased by 5.7 percentage points, whilst there was no corresponding change in scores in schools in the top two deciles. The study also found that schools in the bottom two deciles were excluding an average of two students per school from the final year of primary school (Grade Seven), and that both the positive examination gains and the negative enrolment effects appear to be driven by the district and school rankings rather than by other reform components.
What degree of confidence can we have that the improvement in Certificate of Secondary Education Examination (CSEE) pass rates between 2012 and 2016 was genuine? Whilst there was improvement this was less impressive when expressed as a total number of passes rather than as a percentage increase due to the reduction in candidates taking the CSEE from 2014 onwards.
At secondary level, national ‘Form 4’ CSEE pass rates had fallen to 43% in 2012, driven in large part by extremely rapid increases in enrolment since 2006. CSEE pass rates improved significantly after the introduction of BRN!, increasing to 57.1% in 2013 and then 69.8% in 2014, before falling slightly to 68.5% in 2015. This meant that the 2015 CSEE pass rate was 11 percentage points behind the aspirational BRN! target of 80%. It is also very important to note that the methodology for marking CSEE changed in 2014, weakening the comparability of results year on year.
What the pass rate graph also does not show is that there was a significant drop in the number of candidates sitting the national CSEE at the end of Form Four in 2014, as a result of the introduction of a new requirement to pass exams at the end of Form Two in order to progress to Form Three. This policy was introduced prior to BRN! and had a significant impact on the 2014 figures. While the overall pass rate increased significantly (from 57% in 2013 to almost 70% in 2014), the actual number of candidates passing CSEE declined from just over 200,000 in 2013 to 168,000 in 2014. When viewed in terms of the actual number of students passing CSEE, 2015 was by far the most impressive year: the number of passes was 243,200, as the overall number of candidates sitting the exam rebounded to 336,800.
Some of the biggest gains in PSLE results in Tanzania were in regions supported by the DFID-funded EQUIP-Tanzania programme but these gains were still broadly in line with national improvements. According to analysis of World Bank Service Delivery indicators, it was a very different story at early grades (or 3Rs), which was the core focus of the EQUIP-Tanzania programme as well as a focus for BRN! This data shows that the EQUIP-T regions massively outperformed other locations.
While overall national trends were positive, not all regions improved; in fact, performance in some declined. EQUIP-T regions accounted for the top 6 performing regions and 7 of the top 8. The only non-EQUIP region in the top 8 was Mtwara, which hosted a USAID literacy programme during this period. This data suggests that, in the early grades, while attention from BRN! was sufficient to improve learning in some regions, the interventions from EQUIP-T may have been a more significant factor in improving early grade literacy outcomes.
• The use of delivery approaches in the Tanzanian education system began in 2013 when the then President Kikwete announced the Big Results Now! Programme to accelerate progress in key sectors.
• The approach aimed to achieve rapid results in the final two years of Kikwete’s Presidency and was based closely around the experiences of PEMANDU in Malaysia, involving the use of delivery labs, the creation of a Presidential Delivery Bureau and Ministerial Delivery Units and a focus on regular progress reporting.
• After President Magufuli came to power at the end of 2015, he rejected what he saw as parallel systems, disbanded the units and merged their functions into Ministry departments. While some elements disappeared, others remain in a different form, and under the World Bank-funded Education Programme for Results (initially linked to BRN!) a new style of delivery has emerged based around incentivising national and district performance improvement in priority areas.
BRN! and EP4R, Tanzania

BRN! and EP4R have demonstrated system leadership in different ways. Whereas for BRN! systemic leadership came from the very top and put pressure on others to achieve, EP4R has used financial performance-related incentives to build system leadership from the Local Government Authority (LGA) level upwards.
• BRN! was President Kikwete’s flagship policy in his final 2 years in office. His personal involvement generated interest and support from donors, media attention and commitment from Ministers and Regional officials. Kikwete’s personal focus fell away after a year of implementation and progress meetings were delayed meaning that the Ministry of Education was often left alone to drive improvements.
• This initial commitment from the top generated a focus through the system on improving 3Rs (early grade) exam results, even though the BRN! methods for achieving this were less well understood.
• EP4R has not had the same level of publicity and commitment from the highest levels of Government. The Minister is disconnected from the process, but the Permanent Secretaries are on board and are ensuring that the Ministries remain focused.
• The financial incentives structure in EP4R has helped to build a different kind of system leadership, in which Regional and Local Governments focus on delivering results in order to receive additional decentralised funding. This is more about them delivering results for their own benefit than responding to top-down pressure.
“Publicity was high for BRN! There was lots of attention from all education stakeholders. You can see the results were increased during that period – big increases.”
Regional Education Officer

“The central Ministry was enthusiastic at the start with BRN!, but now is fatigued – some of the best people have moved on. Within MOEST there is a weakening of capacity and impetus to get things done. PO-RALG is more open to taking risks and has been more effective. But while we are seeing that priorities focused on centre are under-performing, LGA results are improving. The high turnover of staff at local levels is frustrating and repetitive but also provide new energy. The progress with LGAs is increasing, year by year.”
EP4R Technical Team Leader
Under BRN! there were external delivery units supporting individual units and the overall approach. This external structure has been removed under EP4R, with all elements, including technical assistance support, being embedded within existing systems. BRN!’s response to a comparatively weak delivery system (by Malaysian standards) was to create a strong central unit (PDB) and then look to establish units at Ministerial (MDU) and even regional (RDU) level (although the latter were never really functional). The prescriptive BRN! focus on detailed 3 Feet Plans did not really work in Tanzania’s decentralised education system, as central accountability was unclear and plans quickly became outdated. The Ministry of Education’s Delivery Unit (MDU) did not focus on delivering these 3 Feet Plans (as prescribed by the PDB) but instead focused on establishing a monthly reporting system to track and provide feedback on Regional and LGA progress.
• BRN! set up a President’s Delivery Bureau and Ministerial Delivery Units to support and monitor BRN! implementation.
Ministry of Education/ PMO-RALG
Central Ministries
TA team
• The majority of staff in the MDU were pre-existing ministry staff who were ‘double-hatting’.
• Under EP4R the focus on implementing priorities came from key positions within the Ministry, e.g. Director of Policy and Planning with support from an embedded TA team. There were no new structures created.
25 Regions and 163 LGAs
Front line staff
• While the visible structure changed in other areas there was significant continuity between BRN! and EP4R with a member of the TA Team having previously provided embedded support within the MDU.
Front line staff
President, State House
BRN! and EP4R took different approaches to agreeing priorities. The initial BRN! priorities were developed in a 6-week Delivery Lab involving 34 participants from a wide range of organisations, including MoEST, PMO-RALG, NECTA, the Tanzania Education Authority, various universities and teaching colleges, development partners and civil society. These priorities then formed the basis for ongoing discussions and negotiation between the Ministry of Education, the World Bank, DFID and SIDA to establish an initial set of EP4R ‘Disbursement Linked Indicators’ (DLIs). With the abolition of BRN! a second set of DLIs was developed in late 2016 which focused on similar priorities to BRN! but attempted to shift focus towards systemic improvements (where BRN! had not had much impact). Activities under both BRN! and EP4R were relatively broad in scope.
BRN – Delivery Lab priorities
EP4R Disbursement Linked Indicators (phase 2)
1. Foundational activities (a series of strategies to be developed to achieve the other indicators)
2. Timely and adequate resource flows – notably direct capitation grant flows to schools
3. Improved data management
4. More efficient teacher allocation
5. School incentive grants
6. Improvement in (early grade) learning outcomes
7. Improvements in pupil survival and transition rates
8. Improvements to the Quality Assurance system
BRN! and EP4R have also taken different approaches to understanding delivery systems and problem solving. While BRN! developed detailed 3 Feet Plans for all activities up front during the 6-week Delivery Lab, EP4R has taken a more gradual approach that allowed for more considered initial problem solving.
BRN
• Plans were developed in the Lab and the PDB and MDU’s role was to oversee effective implementation.
• Some of the plans did not take account of challenges with local delivery systems or capacity and lacked understanding of root causes of issues. This subsequently became the focus of problem solving efforts.
• ‘3 Feet Plans’ were so detailed that they quickly became outdated. Officials who had not been present in the Lab felt that Plans were ‘given to them’ with no ownership in their development.
• Some plans were also not clearly linked to available funds or appropriate technical assistance. As such the quality and speed of implementation varied considerably by activity area.
EP4R
o “The BRN! plans were developed in the Lab. Some of them were too ambitious and so we had to change e.g. a 1,200 target reduced to 500. Some plans didn’t understand the context, didn’t realise how long the government procurement process was or they didn’t think about the effect of the rainy season on things.”
o BRN! PDB Education Manager
• EP4R approaches were developed and altered over a longer period of time.
• National government had access to external technical assistance when developing plans but also focused on partnership building with central and decentralised officials to agree realistic and implementable approaches.
“There is not the problem-solving capacity within the system to come up with problem solving solutions – so financial incentives only would not lead to the same kind of results we’ve seen. We need to be able to bring in new ideas/new ways of thinking into the system – can send teams out or bring things in. Nobody has the experience of doing anything differently. There are a few exceptional people, – but that’s just it, they are the exception.”
EP4R Technical Team Leader
Front-line workers have been involved in developing solutions for both BRN! and EP4R, but whereas this was largely an unintended consequence of BRN!, it has been a much more deliberate strategy for EP4R.
• Front-line workers were not heavily involved in the BRN! Delivery Lab where 34 selected officials discussed, analysed and decided upon the priority areas and interventions.
• However, due to the success of the communications on BRN! and the fact it was supported by the President, a number of front-line workers developed their own solutions to improve learning outcomes and exam results. These solutions, such as remedial classes for struggling students or district-level tracking of the number of pupils who ‘can’t read’, were part of BRN! but a lack of resources meant no clear approach was developed. Schools therefore took their own disparate approaches to implementation, motivated by a desire to achieve results – this meant that local solutions were developed outside of the 3 Feet Plans.
• EP4R has been much more deliberate in involving front-line workers in problem solving. From conducting field visits to discuss Quality Assurance reform with Ward Education Officers and Inspectors to discussing ways of making teacher allocation processes more equitable within districts there has been a focus on developing solutions that will work given the individual contexts.
• The provision of financial incentives for decentralised levels (districts and schools), not just the central Ministry, has also meant front-line workers have more ‘skin in the game’ and understand how reform can benefit them more directly.
“In reforming the Quality Assurance system we placed some district inspectors at the heart of the process and helped them to build something that was implementable in the context with some technical assistance. What emerged was a solution that integrated available resources in schools, wards and district that they see as their own to roll out…it’s a workable solution.”
EP4R Technical Team Project Manager
“No-one is willing to lose the money from EP4R. Incentive to strengthen in priority areas, solve problems and achieve results”
Regional Education Officer
BRN! established regular and structured reporting mechanisms to track implementation and held regular meetings to review and discuss progress. However, whilst these were sometimes useful, the PDB never really understood the decentralised delivery system in Tanzanian education and insisted on weekly performance resolution meetings with the MDU (as was the case in Malaysia), whereas data from LGAs was only available on a monthly basis. Under EP4R the performance management processes have become less structured and less frequent, but more system-led.
BRN!
• Weekly processes: PDB meet with MDU to identify delivery issues and develop solutions.
• Monthly processes
• Monthly progress update sent from LGAs to Regions, copying in the MDU (a one-page Excel reporting template was produced and submitted by email). Regions reported for their LGAs against all KPIs.
• MDU reported to Minister and PDB.
• MDU meet with PDB and NKRA Steering Committee to review progress and agree areas for follow up.
EP4R
• Less structured or regular than under BRN!
• Within system reporting – no external units
• Weekly update meetings with Ministry
• Reporting from regions to government less structured and usually quarterly
“We don’t do monthly reporting now, but we still track progress and report into the government. Usually on a quarterly basis. I can’t say we miss a lot from the BRN! monthly reporting.”
Regional Education Officer
“At the start of BRN! we used to have a weekly catch up, it wasn’t problem solving, it was an update/staff meeting and could take the whole day. BRN! help focused meetings on 2-3 issues for problem solving. Weekly meetings in the Ministry nowadays are back to an update type approach I think.”
PDB Education Manager
BRN! involved high levels of external inputs, particularly at the start of the process, from PEMANDU consultants and advisers and the establishment of new structures. EP4R input is more limited and instead focuses on incentives connected to external results-based financing; the cost of this results-based financing is, however, highly significant (approx. US$250 million) and exceeds the resources spent on BRN!.
Tanzanian Government Input
• BRN!: Presidential and Ministerial time. Staffing of the majority of the MDU. Resourcing of most BRN! activities.
• EP4R: Ministry staffing. Limited budget for support activities.
External input
• BRN!: Technical assistance from PEMANDU. Staffing of the PDB and limited external staffing of the MDU. PDB staffing and Labs funded by DFID.
• EP4R: Technical assistance team within the Ministry, funded by DFID. The largest input was in the form of results-based finance from the World Bank, DFID and SIDA.
Alignment of Budget and Delivery priorities
• BRN!: Funding was a concern – activities were not all planned and costed based on available budgets, leading to resource gaps. Other donor-funded programmes already supporting grassroots primary education (funded by DFID, USAID and GPE) were requested to support some BRN! interventions.
• EP4R: Initial challenge to get activities started without immediate funds. Strong alignment, as the priorities were linked to financial disbursements and so achieving the results would provide the Ministry and LGAs with additional funding.
“We had a challenge with budget. We had no budget for some activities. The Director of primary couldn’t reallocate budget so had to take it to PDB and then raise with Minister for re-allocation.”
PDB Education Manager
“Managers relied on PEMANDU staff to perform functions that should have been fulfilled by PDB staff. This meant that PEMANDU staff frequently acted as substitution of capacity, rather than being able to focus on transferring skills and capability to Tanzanian counterparts.”
“We didn’t have any money to implement initially. So EP4R was unpopular then. We agreed foundational activities to provide seed funding and it has worked since then.”
MoEST Senior Official
Both BRN! and EP4R have used communications to try to bring about change. BRN! was effective in focusing stakeholders on improving national results, whereas the EP4R incentives seem to have promoted a more local focus on the improvement of specific systems.
• BRN! came after a period of national coverage about bad exam results and was clearly seen within regional and local government as about improving exam results.
• This focus led to schools being encouraged to take additional measures to support improvements in results, such as holding additional classes for struggling students, even though BRN! did not implement this in a consistent manner. This was a positive achievement for BRN! communications, as results were achieved without committing any dedicated resources.
• A World Bank survey from 2016 found that 82% of headteachers felt that BRN! had made their school more focused on exam results. The same survey found that almost 70% of interviewed teachers had claimed to hold additional classes for students to enable them to catch-up on topics where they were behind and to prepare for exams.
• EP4R has used communication about the incentives on offer for improved system functioning in particular areas to build focus on change in specific identified priorities, e.g. teacher allocation.
“BRN! was focused more on performance in exam results, especially for KKK (3Rs). It helped identify those schools not performing. The legacy of BRN! can be seen there.”
Regional Education Officer
“In the region we were motivated by the trophy from being number 1. Our ranking on exam performance was an incentive. To keep Katavi in the top 10 was our motivation, even after BRN!.”
Regional Education Officer
“Publicity was very high for BRN! This raised expectations and pressure. EP4R is not the same. It is more about improving internal processes. EP4R assists us with money when we meet the targets of the initiatives. We receive money when we present timely data, we get money for acceptable PTR…”
Regional Education Officer
BRN! did not fundamentally change accountability mechanisms, performance appraisal or promotions across the education delivery system, but it did have an impact through a ‘top-down’ bureaucratic focus on results and targets. By contrast, EP4R uses financial incentives to encourage improved accountability.
• BRN! set a series of national targets which were then disaggregated to regional and district levels. Regional Education Offices (REOs) and District Education Offices (DEOs) then signed performance contracts with the Ministry of Education and PMO-RALG committing to achieve those targets.
• To the best of our knowledge nobody was fired or dismissed for failing to achieve a BRN! target. There were plenty of rumours about the importance of achieving the targets (in Districts BRN! was jokingly said to stand for Better Resign Now) but the approach taken was more of ‘name and shame’ for poor performance rather than more punitive disciplinary measures. This naming and shaming did appear to have an impact on behaviour at school, district and regional level as BRN activities such as extra classes were carried out on a large scale without the provision of financial incentives.
• BRN! did not make any changes to the system of teacher recruitment, promotion and performance management – it remained as difficult to dismiss a teacher for poor performance in 2016 as it had been in 2012 – but it did bring about a temporary reduction in outstanding teacher claims as BRN! funds were committed to reducing arrears (although BRN! did not effectively address the structural issues which caused the build-up of claims in the first place).
• Under President Magufuli (elected in November 2015) there was a much stronger focus on performance management and accountability. Although this was not specifically linked to EP4R, implementation did benefit from this stronger performance culture.
“We can hire and we can fire” – Regional Commissioner of Kigoma, 2016. The fear of being sacked was a new dynamic in the civil service and a large motivator in 2016. At times it also became a paralysing factor, with people scared to make decisions for fear of making the wrong one.
• EP4R used positive (rather than punitive) financial incentives to encourage accountability and behaviour change; this appears to have had some positive effects at district level.
“We used to say that BRN stood for ‘Better Resign Now’ as that is what we thought we would have to do if we did not achieve the pass rate targets.” District Education Officer
BRN! achieved some rapid results in terms of exam result improvements but most of the specific processes established faded quickly after 2016. EP4R seems to be having more of an impact on establishing results-focused cultures but whether it will survive the removal of incentives is unclear.
• BRN! aimed to achieve rapid results in two years. It aimed to do this by focusing on specific objectives, but arguably its biggest success came from the Presidential and Ministerial communication about the need to improve exam results.
• One area that does seem to have sparked some change is the publication of rankings. Several local government officials said these public rankings were a source of motivation to improve. The rankings are an initiative that has outlived BRN!.
• EP4R is certainly getting regions and local government authorities to think much more about results rather than process in some key areas. This is the start of a cultural change, but it is not yet so embedded that it would survive the removal of the financial incentives that are motivating these changes.
“The ranking and recognition was a big motivation for teachers, schools and regions. NECTA is continuing to rank but the rewards are reduced – it is assisting but not as much as previously. Motivation is still there but it is more on the individual now.”
Regional Education Officer
“The EP4R approach is helping the government to align domestic resources to results. PS Nzunda is talking about performance-related pay/promotion for teachers. He has gone after the LGAs with the message better performance = more resources.”
EP4R Technical Team Leader
“Without incentives – I can’t guarantee the good practices will continue. I think for some LGAs they will for others they won’t.”
Regional Education Officer
BRN! did not have an explicit focus on private schooling. Private education has not historically been encouraged in Tanzania due to the prevailing public-goods ideology initiated by President Nyerere at the time of independence. Private schools were included within the BRN! targets and, after some discussion, it was agreed that some private schools would be recognised through the school incentive scheme (although excluded from monetary rewards).
• At the time of BRN! (2013 to 2016) approximately 6% of all primary schools and 30% of all secondary schools were privately owned and operated. This meant that there were about 900 non-government primary and 1,200 non-government secondary schools in Tanzania enrolling over 650,000 students each year. Most of these schools were found in urban areas.
• Private school results were included in the BRN! targets and it was agreed that a limited number of the most improved non-government primary and secondary schools could participate in Annual Education Week and be awarded certificates of improvement by the President. These private schools were not eligible to receive the financial rewards given to government schools through the school incentive scheme.
• No government resources were allocated to enable private schools to achieve BRN! targets or to participate in BRN! related activities.
• Following the election of President Magufuli in November 2015 and the introduction of fee-free education, the Ministry of Education was instructed to introduce a fee cap to regulate the level of fees which could be charged by non-government schools. After research into international examples of the effectiveness of fee caps, this idea was dropped, but efforts were then made to tighten the inspection and regulation of private schools. This led to the development of a quality assurance framework which subsequently influenced EP4R’s efforts to improve quality assurance in government schools.
BRN! focused attention on improving early grade learning outcomes and exam results. While the quality of approaches introduced to achieve this varied, the high level focus and publicity itself led to local adoption of strategies and a renewed emphasis on learning outcomes.
Some schools moved ‘better’, more experienced teachers, who usually taught the higher grades, to the early grades to improve learning for those children.
The performance rankings were a motivation both ways. Schools prominently displayed certificates. One region had a ‘worst performing district’ trophy to shame that district into performing better next year and to warn others.
Regions and districts started to have conversations about learning and performance. When it was seen that EQUIP-T regions were performing better in early grades neighbouring regions asked what they were doing differently.
Districts and schools started talking about the number of children who ‘can’t do the 3Rs’. While the measurement for this was unclear, there was a genuine effort to reduce the number and track progress.
Schools identified pupils that couldn’t read and developed their own ways to support them. Some removed them from lessons until they had caught up on the basics. Some ran extra catch up classes before school or in breaks.
Successes
• Improvements in information flows between districts, region and national government.
• Increased transparency over school performance and relative examination results.
• Enhanced focus on student attainment.
• Local problem solving and innovation as Districts made efforts to achieve BRN! targets.
• Resource mobilisation to schools through introduction of direct to school capitation grant system in 2016 following its prior inclusion as a BRN! priority activity area.
• Culture change and capacity development within Ministry of Education staff working on BRN!
Weaknesses
• Structural weaknesses of education delivery system remained unaddressed.
• Issues with consistency of Presidential engagement and willingness to tackle financing gaps.
• Lack of connection between some BRN! activities and improved educational outcomes.
• Absence of focus on falling primary school attendance levels which were not identified as an issue during the Delivery Lab.
• Limited community engagement and use of ‘bottom-up’ accountability mechanisms.
• Uneven willingness and inconsistent incentives to drive change combined with lack of dedicated capacity to drive necessary system improvements.
• The delivery approach was used to rapidly implement post-EVD recovery plan priorities in multiple sectors. The focus of the PRP was on implementing a time-bound plan through multiple actors rather than on longer-term education system reform.
• The President’s Delivery Team (PDT) operated as a highly structured and effective implementation unit, working alongside Ministries, though not always through them.
• Robust prioritisation, planning, reporting and accountability structures, but largely driven by external technical assistance rather than system actors.
• Very strong system leadership from the President, Chief of Staff and Ministers developed an effective accountability framework, but system ownership of processes was more limited, which led to them disappearing at the end of the PRP period.
• Strong results in relation to key initiative goals in a very challenging environment but limited impact on improving system functionality or ability to deliver without external technical assistance.
• Exam results improved during the period of the PRP, but this was from a low post-EVD base and so some of this improvement can be accounted for by natural system recovery. The PRP initiatives themselves did not have an explicit focus on improving learning outcomes.
• The President’s Recovery Priorities (PRP) were introduced by the Government of Sierra Leone in July 2015, with a focus on education, energy, governance, health, private sector development, social protection and water.
• The programme was intended to drive sustainable socio-economic transformation in Sierra Leone following the twin shocks of the Ebola Virus Disease and falling commodity prices.
• The PRP was overseen by the President’s Delivery Team (PDT), led by the Chief of Staff, and was characterised by a tight weekly performance management regime.
• The education initiatives ran to July 2017 and focused on the production of lesson plans, classroom construction, school approvals, payroll verification & school feeding. Each initiative was headed by a Director within the Ministry of Education, Science and Technology (MEST) supported by the Recovery Coordinator/Deputy Minister’s office.
• Each Director had a Working Group which met weekly to review progress with the Minister and Recovery Coordinator - reporting to the President every Thursday.
• Proved successful in delivering short-term results. Given capacity constraints in Sierra Leone the focus was on putting in place the essential ‘building blocks’ of the system (approving schools, understanding how many teachers there were etc.) rather than on learning outcomes.
• Delivered largely by external funding from DFID & other development partners and through embedded international Technical Assistance from McKinsey, AGI ASI & Cambridge Education.
System Leadership & Commitment: Very strong
• Strong sustained commitment from President, Chief of Staff and Ministers.
• Weaker system leadership at lower levels of the education delivery system.
Prioritisation and resourcing: Strong but external
• Clear prioritisation and analysis of costs/gaps.
• Prioritisation involved a small number of people and was led by external Technical Assistance.
• Relied almost exclusively on external finance.
Data, information and routines: Strong but externally driven
• Robust structures and flow of data for progress monitoring established using UNICEF RapidPro and other external data sources.
• Strong PDT team analytical capacity.
• Analytical capacity also existed within Districts through externally recruited TA teams.
Understanding of delivery and problem solving: Strong within the PDT team, weak elsewhere
• Very externally driven through the strong PDT teams at national and district level.
• System problem-solving ability not really strengthened, particularly at local levels.
• PDT and District TA teams analysed and problem solved with each other or with external actors such as NGOs, so there was little genuine capacity building.
Communication, accountability and culture change: Very little cultural change
• Communication on priorities and expectations was clear.
• The PDT was almost a parallel system, not really strengthening the system and potentially even making it more dependent on external input.
Examination results improved to a peak in 2017 (the end of the PRP delivery phase) but dropped off in 2018. There are mitigating circumstances for both the 2017 improvements and the 2018 fall, however, which make it hard to draw definitive conclusions about gains in learning outcomes during the PRP.
• Data in the accompanying graph indicates that examination pass rates did improve during the PRP period, which ended in 2017.
[Graph: pass rates for NPSE (Primary), BECE (Junior Secondary) and WASSCE university-level pass (Senior Secondary)]
• These improvements were achieved in a post-disaster (EVD) situation and were coming from a low base. For BECE there were larger increases between 2014 (48%) and 2015 (61%) than during the PRP period (72% in 2017). Similarly, the WASSCE pass rate increased from 7% to 15% between 2014 and 2015 and then from 15% to 20% between 2015 and 2017.
• Conversations with Ministry officials indicate that they feel the decline in exam results in 2018 may have been impacted by reform efforts including:
• A clamp down on leakage of papers and better monitoring of exam centres
• A switch to ‘conference marking’ to reduce corruption by individual examiners
• In addition, schooling was significantly affected in 2018 by the election (and Presidential run-off), with schools closed and teachers absent for a significant period. This is likely to have affected exam preparation, with teachers having various election-related roles in local areas.
• Sierra Leone was emerging from the devastating impact of the Ebola Virus Disease (EVD). There was a recognition from Government and donor partners that a focused effort was needed to help the recovery to be fast-tracked.
• The recovery plan was therefore to be supported by a Presidential Delivery Team (PDT). This was an external cross-sector delivery approach that supported rapid implementation of priority recovery activities.
• PDT was supported by the Government at the highest levels and by the major donor partners.
• The focus was not on systemic improvement but on the rapid delivery of recovery priorities. In education this was about getting pupils back into schools and solving some of the most immediate issues impacting poor learning outcomes.
Sierra Leone
The PDT team worked alongside the Government delivery pathway, but also operated its own internal pathway for reporting and problem solving. This was a delivery approach that was heavily driven by a temporary, external team backed up by very strong system leadership.
PDT Central team
• At the top of the PDT process was an accountability and oversight process led by the very highest levels of government and requiring regular reporting to the President and his Chief of Staff.
Central Ministry
District Government
PDT Ministry Team
• PDT had an external delivery structure that operated in partnership with, but external to, the existing Government system. This included a central PDT team working with State House, Ministerial teams working with senior Ministry Officials and District Teams working with district government and implementing partners.
PDT District Teams
• This was not a small team focusing on a few delivery priorities while normal service continued elsewhere; rather, it was a re-focusing of the whole education system around some core objectives for a short (two-year) period to achieve some clear results.
President, State House
Strong system leadership was critical to the success of the PDT in education and other sectors. The level of commitment, ownership and activity from the President, Chief of Staff and Ministers over a sustained period enabled the PDT to develop an accountability mechanism which had an impact on officials within the Ministry of Education, Science and Technology.
• There was commitment from the very top on a regular basis (weekly) for a sustained period (2 years) and this enabled an accountability system to develop which changed the working culture in the Ministry of Education, Science and Technology.
• This central leadership in turn led to sector leadership, with Ministers and Deputy Ministers being held to account for the progress in each sector on a regular basis.
• In addition to this system leadership, a highly functioning external support unit operating alongside existing systems at all levels acted to ensure day-to-day focus, progress and regular flows of information across the system.
• “My role was to be ‘after everyone’s neck’, to ensure the reports were ready on time to present.”
Produce and disseminate primary lesson plans in core subjects
Clear teacher payroll and recruit/re-allocate capable teachers to fill skills gaps in schools
Provide teacher training for all primary and secondary teachers
Assess and action 500 applications from non-approved schools. Revise and communicate minimum standards for approval.
Provide school feeding for 1.2m children in Government and Government-assisted primary schools to encourage pupils to return to school and improve attendance levels
Strengthen the capacity for regular, school-level data collection
Construct 1,000 classrooms in the most crowded schools and equip them with furniture
Construct, rehabilitate or validate 360 water facilities and 375 latrines in schools
The prioritisation process was evidence-based and thorough, looking at the available evidence from across the country. The planning process was very centralised within the PDT, though – plans were the same in each district except for variations of scale (in terms of quantitative targets) and the identity of implementing partners.
“Regional consultations were more like presentations. It was to inform people about the approach and plans rather than to consult.”
PDT District Facilitator
The PDT used available evidence to conduct analysis of the problems to support prioritisation and develop robust and implementable plans. Deeper local-level problem solving to unblock existing systemic delivery barriers was led by District TA Teams, who were externally recruited and reported to the central PDT.
• Using available data the PDT team identified priority initiatives that aligned to the overall recovery goals.
• Most of these initiatives involved effective implementation plans, management of external technical assistance or implementing partners and systemic oversight of progress.
• Fewer initiatives actually involved supporting the Ministry to resolve problems which were causing barriers to effective service delivery. Where these were in place they were often led by external partners e.g. payroll verification.
• Work on delivery chain blockages at district level was led by external District TA Teams, with the role of local government focused on overseeing and reporting on implementing partner progress.
“There were regional consultations, but these were presentations not discussions. They were presented for validation really. Some local Ministry officials were not always happy with the plans.”
PDT District Facilitator

“Within each district the local people were involved. Problems would be solved locally, it is only big issues that would be escalated to the Ministry. This was very seldom.”
Former Deputy Minister

Staffing and resourcing (GoSL staffing, external staffing, and alignment of budget and delivery priorities):

State House
• GoSL staffing: President and Chief of Staff
• External staffing: external PDT team, headed by Yvonne Aki Sawyerr, with international (McKinsey) and national team members
• Budget alignment: rigorous cost analysis aligning priority actions to identify costs, available funds and gaps in funding; but a large proportion of funding came from donor partners (particularly DFID), not GoSL

“There was some resentment in the districts. The PDT team had vehicles, fuel, top up (for phone calls) and computers. They didn’t get any of that until the end, when the project assets were disposed of by being given to them.”
PDT District Facilitator

National level
• GoSL staffing: Ministry – Minister, Deputy Minister, Chief Education Officer, Directorate Heads (initiative leads) and data staff
• External staffing: PDT liaison person for education; rotating support from McKinsey staff (3-month rotations); local interns
• Budget alignment: as above – strong alignment of priorities with cost analysis against funds available, rather than the recurrent Ministry of Education budget

Districts
• GoSL staffing: District Council staff and District Education Office staff
• External staffing: PDT team of 3 – District Facilitator, Data Analyst and Community Engagement Officer; implementing partners; ICT resources to report and logistical resources to call/visit
• Budget alignment: at this level the focus was on ensuring effective spend against plans. There was some resentment from Government officials, who lacked the IT/logistics equipment and finance that the PDT had access to.
Assessment of staffing and resourcing indicates that considerable funds were spent establishing external structures at Presidential, Ministerial and District levels. The actual resources available to Districts (rather than implementing partners) were limited.
PDT established and maintained strong performance monitoring and management routines which were at the heart of the success of the delivery approach. There were inter-connecting performance monitoring and management cycles at District, Ministry and State House level. Feedback would also come back down the system from the State House team.
District level:
• Weekly progress trackers updated and sent to central PDT team – data was sent from implementing partners or schools and verified through field visits
• Fortnightly district delivery team meetings – cross-sector meetings chaired by the District Council Chair to report on progress and discuss areas needing attention
• Sector working group meetings – weekly or fortnightly. Led by the district PDT team to prepare each sector for the district delivery team meeting.
National level:
• Weekly Monday Management Meeting – all directors and initiative leads meeting with Minister, Deputy Minister, Chief Education Officer and PDT Lead for Education
• Every week – working group meetings for the different initiatives
• Weekly progress tracker – compiled from district reports
State House level
• Weekly Presidential Forum – each Ministry would report to the central PDT team and the President’s Chief of Staff. Often (fortnightly or monthly) the President would chair and Ministers would be expected to present.
“The biggest change was improved co-ordination on certain topics. Before the left hand didn’t know what the right hand was doing and so couldn’t shake.”
Dr Thorpe, former Deputy Minister of Education

Data
Description of approach to data and information
Data on implementation progress at district level
• Classrooms completed to various stages
• Lesson plans distributed
• WASH facilities constructed
• School feeding practice
• Varied depending on the initiative – some weekly, some monthly, some quarterly
• Submitted by implementing partners and schools
• Verified by district teams (PDT and GoSL)
Situation room
• Community monitoring of the above indicators, e.g. classrooms constructed, school feeding, pupil attendance
• Collected multiple times a month
• Reported on monthly
• Community data collectors
• Aggregated by District/national
Situation Room Team
District, National and PDT level performance trackers
• Using data from the above two sources to populate weekly progress trackers
• Updated weekly
District PDT Team, National PDT Team & Ministry of Education
“We wrongly assumed district M&E staff would have the data needed. But there was a big disconnect between District Ministry and HQ staff. Districts didn’t have the infrastructure or logistics to get data and send regular updates. PDT helped with that.”
PDT district facilitator
Ministry and PDT joint monitoring
• Joint monitoring visits for each initiative to view progress and challenges.
One primary and one secondary initiative each month
National PDT and Ministry joint team
Data collection and reporting were central to the PDT approach. Better data was collected and verified, and it was used to populate agreed performance trackers to communicate progress.
Sierra Leone is still heavily dependent on donor funding and external technical assistance. As the chart below shows, while the Government led most initiatives it did not fund many of them or act as the main implementation agency. This reinforces the view that PDT was more about effective project management to deliver short-term results rather than sustainable system reform.
“I would say, over 70% was partner implementation, for example the development of lesson plans and construction. There was very little Ministry input at district level beyond monitoring. Some ministries not happy about partners implementing instead of them.”
PDT District Facilitator

PRP was established primarily to achieve rapid results. It did little to change the system, but it did merge into the Leh Wi Lan programme which took on some of the outstanding results areas and has moved on to more system strengthening activities.
• PRP’s approach was to prompt cultural change from within the system by establishing structures (staffed largely through external TA) to increase focus on accountability, data and results.
• The weakness of existing national and district level delivery structures meant that the ambition of cultural change was not matched by the level or duration of capacity building support for existing district staff. The need to achieve short-term results over-rode local level involvement or capacity building.
• PRP did evolve into the Leh Wi Lan programme which carried on some of the initial focus areas but is now looking more to a system strengthening approach, working with district education officials and schools, as the country moves beyond the post-EVD recovery phase.
“The district officials were not always cooperative, they thought we were taking over their work. PDT was very demanding and they didn’t want to do that and saw us as bossy. Some Sector heads were not very cooperative – they didn’t provide accurate data, they didn’t have updates, they didn’t attend meetings. That’s when central PDT intervened.”
District Data Analyst for PDT

PRP was set up to achieve rapid improvements in priority areas. It used tools and processes backed up by senior-level access to increase accountability in relation to specific results. Since PRP ended it does not seem to have built lasting changes to systemic accountability.
• The PRP used robust planning and performance monitoring tools combined with externally appointed people at national and district levels to drive improved accountability for implementation.
• This was backed up with regular oversight from the Chief Minister and the President.
• This seemed to help improve the speed of implementation and temporarily improve the accountability for progress (or lack thereof) but it proved less successful in changing the systemic accountability structures after PRP.
• Part of the reason for this appears to be the reliance on an almost parallel performance monitoring team and reporting chain that helped improve implementation speed, but at the risk of bypassing and possibly weakening already very weak existing delivery systems and accountability structures.
“The indicators in some areas were going from ‘amber’ to ‘red’ due to inaction. A few times we reviewed implementation timelines and the traffic lights would go back to green. We did this on a couple of occasions. It didn’t really address the reasons why we weren’t making progress in those areas.”
PRP did not aim to support the private school system but did support government-assisted schools, which encompass a wide variety of school types, e.g. community schools and those established by religious organisations and NGOs.
• PRP interventions focussed on Government and Government-assisted schools. It did not have any specific interventions targeted at private schools.
• Government-assisted schools cover a variety of ‘non-private’ schools that receive some assistance from the government in the form of finance, materials and personnel but which also receive finance from elsewhere and also employ non-payroll teachers to make up the necessary staffing levels. This includes community and mission schools which are not fully owned by the Government but are considered part of the public system (they are all now included in the Free Quality School Education Programme, unlike private schools).
• Government schools account for 35% of primary schools, with Government assisted schools making up 55% of the total (38% mission and 17% community). Private schools account for 10% of primary schools, rising to 12% for Senior Secondary.
• There are some in the Government of Sierra Leone who view the private school sector as undermining the public system by employing teachers who are on the Government payroll so that many spend more time teaching in private schools than in the Government schools they are allocated to. This also helps to highlight weaknesses in the teacher management system and is a particular issue in Freetown.
• Basic building blocks of the education system were delivered through effective project management including: 400 new classrooms constructed, with a similar number of schools equipped with upgraded WASH facilities; the school approvals process streamlined and improved, meaning that 500 additional schools now receive funding; a payroll verification exercise conducted which identified over one thousand ‘ghost’ workers and numerous errors in existing records; school feeding introduced, meaning that tens of thousands of primary students received meals twice per week; and the Situation Room rapid data collection system introduced using volunteer monitors – data was collected monthly by MEST using mobile phones.
• Reliance on external assistance and recognition that the existing education delivery system was so weak that parallel processes had to be established for data collection, analysis and performance management. This meant that there was relatively little impact on the education system once PRP ended.
Punjab started using the delivery approach in 2011 when the Chief Minister began implementing the Education Roadmap with technical support from Michael Barber and colleagues.
• The initial focus of the Roadmap was on ensuring that the basic foundations of the system were in place: i.) pupil enrolment and attendance, ii.) teacher recruitment and attendance, iii.) textbook availability in schools and iv.) basic school facilities. Impressive progress was made across all four areas during the early years of the Roadmap.
• Success was achieved through relentless and regular monitoring of indicators (including school-level visits by monitors), with data displayed in disaggregated heatmaps which were scrutinised by the Chief Minister during regular stocktake meetings. Officials were held to account and fired for poor performance.
• The approach was subsequently expanded to consider a range of other issues including learning outcomes. This led to some loss of focus as the number of initiatives and areas being monitored grew.
• Six-monthly learning assessments carried out by the DFID and McKinsey Roadmap team showed a significant improvement in literacy and numeracy at Grade 3. However, these findings have been widely questioned, including by former members of the Roadmap team, who noted that the reductive focus on 15 learning outcomes, which were tested monthly, meant that teachers taught to these tests rather than the wider curriculum as a whole.
• The change of government and Chief Minister in 2018 did not bring an end to the Roadmap but there has been a shift in some areas such as a reduction in support for the Punjab Education Fund (PEF). Much of the Roadmap monitoring infrastructure, technical and analytical advice relies on external funding from World Bank and DFID.
The Punjab Education Roadmap commenced in 2011 and was initiated by technical advice from Sir Michael Barber and his team under the leadership of the Chief Minister, Shahbaz Sharif. The Roadmap was designed to tackle a series of long-standing and significant problems in Punjab’s education system including high levels of out-of-school children, poor teacher attendance, a lack of basic facilities and infrastructure and a shortage of textbooks. Punjab also had poor learning outcomes – with 53% of Grade 3 children unable to read a sentence in Urdu in 2011.
• The immediate focus of the first phase of the Roadmap (2011 to 2014) was to tackle these basic systemic issues by addressing accountability and ‘fixing’ inputs. In line with the principles of deliverology the Roadmap focused on four easily measurable priorities: i.) student enrolment and attendance; ii.) teacher recruitment and attendance; iii.) textbook availability and iv.) basic facilities. The Roadmap also channelled funds through the Punjab Education Foundation (PEF) to enable poor children to attend low cost private schools for free at the point of use through a voucher scheme.
• The Roadmap was led by the Chief Minister who held regular stocktakes (initially monthly and then bi-monthly) to review progress. Data was gathered at school level using a network of 900 District Monitoring Officers (DMOs), mostly ex-army, who had first been recruited in 2007 under the World Bank’s support programme. The Roadmap was able to make use of this World Bank infrastructure to drive delivery and it was the Programme Management and Implementation Unit (PMIU), originally established in the Government of Punjab to coordinate and oversee World Bank support, which became the key mechanism for collecting and analysing data providing the foundation for stocktakes.
• Regular stocktakes were at the centre of the accountability mechanisms established by the Roadmap. Executive District Officers who failed to deliver against targets and trajectories could be dismissed and this was transmitted down to school level where the 900 DMOs made independent monthly visits to 60,000 schools, changing the schools regularly to avoid routine. This created fear amongst officials and led to a sharp decline in unauthorised teacher absence and a culture whereby achievement of Roadmap priorities over-rode everything else.
• From 2014, after the approach had undoubtedly proved successful in establishing accountability routines and basic indicators (but with no real measurable impact on actual teaching practices or learning outcomes), the Roadmap team sought to expand the priorities and focus on improving learning and classroom practices. Initially it proved difficult to change classroom practices, due in part to a lack of clarity over what data to use, and focusing on exam results as targets in what had become a high-stakes accountability system led to gaming and cheating.
• In 2015 the Roadmap team started to measure learning in classrooms as the DMOs administered a sample of literacy and numeracy questions to pupils during their school visits. Learning results improved in the areas which were being tested but teachers focused on improving these narrow set of learning outcomes rather than teaching to the curriculum. It is therefore questionable as to the extent to which this approach led to improvement in general teaching practice and learning outcomes.
• The change of government and Chief Minister in 2018 did not bring an end to the Roadmap but there has been a shift in some areas such as a reduction in support for the Punjab Education Fund (PEF). Much of the Roadmap monitoring infrastructure, technical and analytical advice relies on external funding from World Bank and DFID.
System Leadership & commitment
Excellent
• Chief Minister’s absolute commitment to holding bimonthly stocktakes.
• Transmission of this commitment down to school level.
Prioritisation and resourcing
Very strong initially, but the addition of more priorities as the approach proved successful weakened focus.
Data, information and routines
Very strong
• Focused clearly on 4 key metrics which were easy to measure on a monthly basis.
• Resources available to support implementation from World Bank and DFID.
• Whilst the delivery approach persists following the change of government there is less clarity than before as to its priorities and objectives – perhaps inevitable as it started 8 years ago.
• When approach proved successful more priorities and indicators were added, leading to a dilution of focus.
Understanding of delivery and problem solving
Strong understanding of issues on inputs etc. Weaker on improving learning
Communication, accountability and culture change
Strong accountability structures and good understanding of priorities. Culture change within classrooms?
• Accurate monthly data collected, analysed and returned to schools very promptly each month.
• Data fed into clear accountability routines headed by bi-monthly stocktakes.
• Good understanding of constraints to provision of basic inputs.
• Understanding of issues strengthened over time as Districts became familiar with the approach and carried out their own problem solving.
• Accountability was a real shock to the system which led to changes in behaviour.
• Merit based recruitment helped develop a cadre of talented district officials.
• Permanent shift in public culture and expectations about teacher attendance and the need to eliminate multi-grade teaching.
• Cost of c.900 DMOs still covered by external sources.
• Were the things being measured the ‘right’ things for quality long-term improvement?
• Weaker understanding of the fundamental shifts needed in teacher classroom practices or behaviour to impact on learning outcomes beyond the narrow metrics being measured.
• External TA led much problem solving.
• Accountability was top-down rather than bottom-up. A focus on specific outcomes distorted incentives, and over time a parallel structure can erode accountability.
• Less clear that there has been a culture change when it comes to teacher performance or classroom practices.
Initially the Roadmap focused on inputs and accountability before shifting to have a more explicit focus on learning outcomes from 2014 onwards. The Roadmap team found it difficult to measure learning outcomes at a school level in a meaningful way utilising the ‘heatmap’ and ‘stocktake’ approach used for teacher and student attendance. Their solution to this was to use the DMOs to administer a sample of tablet-based literacy and numeracy questions to pupils during their school visits. The results from these assessments are set out below.
• Literacy and numeracy scores showed relatively little improvement for the first year and a half of testing but then showed more significant improvements, particularly in 2017-18.
• Whilst these results are impressive there are concerns about the extent to which they genuinely represent improved learning.
• As the focus was on improving basic skills, the tests were designed around 15 learning outcomes – significantly narrower than the curriculum.
• According to former members of the Roadmap team, teachers then began to focus on improving results for this narrow set of learning outcomes rather than teaching to the curriculum. In effect they were teaching to the test (and a relatively narrow test at that). In some schools teaching of the curriculum stopped and lessons were dedicated to improving literacy and numeracy assessment scores.
SOURCE: Independent Six Monthly Assessment Results as conducted by DFID and McKinsey Roadmap Team in 300+ schools selected randomly on a 6 monthly basis
Whilst there were undoubtedly widespread improvements in literacy and numeracy against the tested set of learning outcomes it is questionable as to the extent to which this approach led to improvement in general teaching practice, coverage of the curriculum and learning outcomes more widely.
“ We were always looking through a delivery lens i.e. what could we measure to monitor progress and show improvement. This meant we ended up creating metrics that can show improvement, but not necessarily the right metrics for quality improvement of the system in the long term…. We made this mistake in Punjab because we were under pressure to improve results in the 3 month cycles that delivery focused on.”
Former member of the Punjab Roadmap team
It is clear that the Punjab Roadmap delivered tangible and apparently long-lasting changes in a number of critical aspects of the education system. Tellingly these changes are most apparent in those areas (student attendance, teacher absence, primary enrolment) which were an initial priority for the first phase of the Roadmap from 2011 to 2013.
• It is clear that the Punjab Roadmap achieved significant results and that many of these achievements took root within the first year of implementation (as noted in Sir Michael Barber’s 2013 essay ‘The Good News from Pakistan’).
• Significant progress has been made in increasing student attendance, reducing unauthorised teacher absences and increasing primary school enrolment.
• Interestingly data from 2017 show that the increased levels of primary enrolment have been driven by i.) a transfer of students from private schools to government schools, perhaps as perceptions of quality increase; ii.) increased retention in schools as fewer students drop-out and iii.) new enrolments and a reduction of out-of-school children.
• Success was also achieved in the production and distribution of lesson plans and textbooks, the provision of basic facilities in schools and the merit based recruitment of teachers.
• All these results areas lend themselves well to a delivery approach because they are relatively simple to measure and track on a monthly basis. The one area where the Roadmap struggled to make progress was on school construction where targets were consistently not met.
“With school attendance you can see quick outcomes- it lends itself well to the delivery approach.”
Former member of the Punjab Roadmap team
“The Roadmap certainly got the system working, more children are going to schools and education is on everyone’s lips, it is a priority.”
• The Roadmap has its origins in DFID’s support to the Government of Punjab and Pakistan’s education sector.
• DFID arranged for Sir Michael Barber, former head of the UK’s PMDU to meet with Chief Minister of Punjab, Shahbaz Sharif, in 2010. This led to the development of the Punjab Education Reform Roadmap in 2011.
• The motivation for the Roadmap was the Chief Minister’s desire (shared by DFID, World Bank and others) to address the critical issues of poor teacher attendance, the high number of out-of-school children and lack of basic facilities and textbooks which were constraining the performance of Punjab’s basic education system.
• The Roadmap did not establish new structures but instead built on what already existed – utilising the World Bank initiated Programme Management and Implementation Unit (PMIU) in the Government of Punjab and the network of District Monitoring Officers (DMOs) who were already in post.
• Technical advice was provided to strengthen these structures through embedded support from a small number of McKinsey staff funded by DFID.
Punjab
Bi-monthly stocktakes with Chief Minister
Programme Management Implementation Unit (PMIU)
36 Districts, led by Executive District Officers (EDOs)
• The Punjab Education Reform Roadmap did not establish new structures (at least not during its initial phase) but made use of the existing Programme Management Implementation Unit (PMIU) in the Government of Punjab which was first established to coordinate World Bank support. The Roadmap TA team did operate as a new ‘team within a team’ in relation to the PMIU.
• The Roadmap provided technical support to enable the PMIU to work effectively, making use of its connection to 900 District Monitoring Officers (DMOs) who collected monthly data from schools but who were managed outside the line management structure of Districts thus making them independent.
• The bi-monthly stocktakes with the Chief Minister were at the heart of the approach, creating a forum where the School Department and Executive District Officers were held accountable for performance.
900 District Monitoring Officers (DMOs)
60,000 Government Schools
• As the diagram illustrates, the stocktake, PMIU and DMO structures ran alongside the government education system, acting as an independent source of advice and analysis and holding the system to account.
• From 2014 the Government of Punjab also created a Strategic Monitoring Unit. This led to some confusion as to how this unit and its accountability mechanisms operated alongside the Roadmap and the PMIU, which also had priorities outside of the Roadmap.
The unwavering commitment of the Chief Minister of Punjab, Shahbaz Sharif, to the Roadmap was one of the most important factors behind its success. The Chief Minister ensured that he made time to participate actively in the bi-monthly Roadmap stocktakes and he ensured that government officials were held accountable for performance.
• The Chief Minister was a very powerful figure in Punjab. He was focused on achieving results and commanded great respect and fear. The Roadmap put his personal authority at the centre of its approach and used this as a mechanism to genuinely hold officials to account for performance throughout the system. He was particularly insistent on seeing ‘rapid change’.
• The key to achieving this was to develop regular data collection and analysis routines with feedback loops linked to the personal intervention of the Chief Minister through bi-monthly stocktakes.
• It is important to note that there was sufficiently strong analytical and technical capacity within the Punjab civil service and education system to respond to the requirements of the Roadmap. There were issues in some of the Southern districts of Punjab where officials were held to account for poor performance which was due in large part to a lack of resources. This was recognised and resources were then directed to these districts.
• Over time there was some dilution of focus as the Roadmap team and the new Strategic Monitoring Unit began to monitor and report on a wider range of priorities, some of which were less well suited to the production of accurate bimonthly data which could be used for performance management.
“Everyone knew that getting something big done in Punjab meant engaging Shahbaz Sharif and ensuring he drove it forward.”
Sir Michael Barber, ‘The Good News from Pakistan’
“Accountability was very much driven by the Chief Minister, he was powerful and nothing would have got set up without his leadership.”
The first three years of the Roadmap were focused on building accountability and fixing inputs (teacher attendance, facilities, infrastructure and textbooks). The second phase of the Roadmap focused on improving education quality, using a six step model of continuous improvement of teaching and learning.
• Whilst it is right to note that most of the funds for Roadmap implementation came from the Government of Punjab, the vast majority of these were committed to recurrent costs such as staff salaries. Punjab was fortunate to benefit from significant funding from DFID and the World Bank over the period from 2011 to 2019, which could be used relatively flexibly to address emerging issues, pay for technical assistance to implement the Roadmap and cover the costs of the District Monitoring Officers. Without these DFID and World Bank resources there would have been insufficient funds to implement the Roadmap effectively.
The establishment of regular and reliable monthly data disaggregated by districts and schools was absolutely critical in operating the Roadmap. Without this data the stocktakes would not have been able to meaningfully address district level issues.
• The Roadmap benefited from the presence of 900 District Monitoring Officers (DMOs) who were equipped with motorbikes and were managed independently from the district education system. These DMOs, initially using a paper-based system but later upgrading to tablets, were able to gather monthly data from school visits on teacher and student attendance and the presence of basic facilities and inputs. They never visited the same school twice within a six-month period, to reduce the risk of collusion.
• Monthly data was sent to the PMIU, where the Roadmap team helped ensure an extremely rapid turnaround: data was received (10th of the month), analysed (15th) and sent back to districts with feedback (20th) so they could see their relative performance.
• Visualisation of this data was also important so that the ‘story’ it was telling was immediately accessible and clear to the Chief Minister during stocktakes. Heatmaps of relative performance by district were produced along with district-level disaggregation of specific issues (see slide opposite on vacant teaching posts).
• The DMOs are still employed and are gathering data from the field on a monthly basis. Their costs are covered by the World Bank. Initially the DMOs were mostly ex-army but now they are mainly graduates of Lahore University of Management Sciences (LUMS).
The initial approach in Punjab was centralised in that targets and priorities were agreed centrally (on the basis of available evidence) and then ‘pushed down’ to Districts. Given the scale of the issues and need for pace this approach was understandable. Over time greater focus evolved on encouraging local problem solving within Districts although much of the analysis was externally driven by the Roadmap team.
• Initially targets were set for districts and they then had to respond to these targets through actions led by Executive District Officers (EDOs) and their officials and schools.
• Data soon showed that some districts were consistently performing worse than others, so the Roadmap team invested time in working with these districts to understand constraints and issues – this led to additional resources being directed to poorly resourced districts.
• Over time there has been a gradual shift towards more localised planning and problem solving within districts but this is still conducted within the overall framework of centralised target setting.
• Whilst districts are being encouraged to solve their own issues and develop localised solutions most of the technical capacity to conduct analysis is still externally driven from the Roadmap team or PMIU.
• Whilst progress has been made on improving inputs and attendance, this has not really been the case for improved classroom practices and genuine learning- suggesting that understanding of these input-based issues is stronger than comprehension of the activities needed to bring about genuine quality improvement at a local level.
The Roadmap relied on the power of top-down accountability to achieve results; accountability which originated with the Chief Minister and which was transmitted down to Executive District Officers, headteachers and schools. This accountability did prove effective in achieving results where the indicators were clear and could be reported on regularly.
• Failure to make progress against the four main performance targets (teacher attendance, student enrolment, basic facilities availability and textbook availability) had significant consequences for Executive District Officers (EDOs) at the bi-monthly stocktakes. Failure could lead to summary dismissal although after a while there was a greater understanding of the impact of local constraints on performance so, for example, more funds were allocated to Southern districts.
• Merit based recruitment of EDOs was also an important factor in accountability and culture change as individuals were now working within a performance framework (albeit a tightly defined and fairly narrow one) where achievement of results was rewarded. This was a quite significant shift.
• This top-down accountability led to culture change at a school level where the proportion of teachers who took unauthorised absence dropped considerably. This accountability was made possible through the monthly visits of DMOs who were independent from the system. Harsh sanctions were imposed for non-attendance.
• Issues started to arise however when the top-down accountability mechanisms which had been used so effectively for basic inputs were expanded to cover more complex issues such as learning outcomes, where the link between activities and results is much less clear. This led to ‘gaming’ of the system and a focus on achieving performance metrics to the exclusion of all else- meaning that the targets drove the system, and not in a beneficial way.
• The Roadmap does seem to have brought about a permanent shift in culture (amongst stakeholders in the education system as well as parents, communities and the public at large) about the importance of teacher attendance and school enrolment. This is a highly significant achievement which should be admired. It is questionable as to whether there has been a similar shift in expectations about quality of education or teacher performance although there has undoubtedly been a shift in enrolment from private to government schools.
“The stocktakes were an act of theatre in a room full of fear. People were publicly humiliated and sacked over the phone.”
Former DFID Pakistan Adviser
“There has been a permanent change in public expectations in some areas such as teacher attendance and the elimination of multi-grade teaching. But these are simple to measure things. We didn’t shift expectations on teacher performance or learning outcomes. ”
Former Roadmap Team Member
“Delivery should be used for catalysing short-term, measurable, quantitative changes and not for complex, qualitative change…Punjab was not a delivery success story for me. It was a lost opportunity for me.”
Former Roadmap Team Member
“Delivery helps to focus on an underserved area – to put the spotlight on it. But if you are using delivery to try to fundamentally improve system accountability then you have a problem. If the system is not holding itself accountable, then you have a core problem to address. Delivery doesn’t address that. It is avoiding it by creating an artificial parallel system.”
Former Roadmap Team Member
Support to the Punjab Education Foundation (PEF) was a key element of the Roadmap approach. PEF is an independent agency with a mandate to provide funds to support the development of low cost private schools in Punjab. The Roadmap embraced this approach from the start and channelled funds through PEF to enable poor children to attend low cost private schools for free at the point of use through a voucher scheme. This proved very effective as the figures below indicate.
• The Roadmap recognised the potential importance of low-cost private schools in expanding access to quality education. The Roadmap thus worked with PEF to expand two of their programmes: i.) the New Schools Programme, which invited non-government or private providers to set up new schools where government provision was non-existent or inadequate, and ii.) a voucher scheme which gave money to poor families so that they could purchase a place at a private school. The private school had to agree to regular student assessments to measure progress.
• Both these programmes expanded significantly with Roadmap support, lending themselves well to the results-oriented approach focusing on attendance and expansion (easily measurable priorities which also worked effectively in the government-run parts of the Roadmap).
“PEF is one area where the new government has taken a different approach since 2018. They have cut its funding significantly as they saw it as being very closely associated with the previous administration.”
Former DFID Pakistan Adviser
“The delivery approach is agnostic to issues like the private sector, it comes down to a choice of strategy and what you are trying to achieve within that sector.”
Former Roadmap Team Member
• Although PEF did face some challenges due to this rapid expansion (including ensuring that the vouchers were genuinely targeted at the ‘poorest of the poor’ and the timely release of funds) it is fair to say that it was a Roadmap ‘success story’.
• Interestingly as the Roadmap progressed enrolment trends in Punjab showed a shift away from the private sector and back towards government schools. Enrolment in government schools had shown little change between 2013 and 2015 but then increased rapidly in 2017. Analysis from the Roadmap team showed that about 1/3 of this increase in government enrolment was from pupils transferring from private schools, perhaps as a result of increased confidence in the government system.
The Quality Improvement Programme was adopted by Haryana in 2014, building on the achievements of the RELEP programme and seeking to take system improvements to scale and enhance the quality of education.
• Technical support to develop the Quality Improvement Programme was provided by Boston Consulting Group (BCG) from 2013 whilst the Michael & Susan Dell Foundation provided financial support for implementation which commenced in early 2014.
• The approach did not look to establish new structures but instead worked through the Directorate of School Education focusing on four actions: i.) zeroing in on an ambitious goal and tracking progress; ii.) digging deep to unearth big systemic problems; iii.) imposing discipline to design scalable interventions and iv.) using new technology to implement innovative initiatives.
• The Quality Improvement Programme was then converted into the Saksham Haryana movement in 2017 (part of the national Saksham programme) and adopted its performance targets.
• Provisional ASER data shows that Haryana has seen improvements in literacy and numeracy between 2014 and 2018 but that these improvements have not been exceptional. ASER data therefore suggests that, on these metrics, Haryana has not outperformed other States which did not adopt the delivery approach.
System Leadership and commitment
Fair
• Chief Minister’s high-profile oversight of Saksham Haryana, including commitment to hold high-profile events at Block level.
• Frequent changes of leadership during the early years of the programme hindered effective implementation.
Prioritisation and resourcing
Medium
• Clear vision of success defined as improvements in learning outcomes.
• Most activities funded from the recurrent budget for sustainability, with funding from the Michael & Susan Dell Foundation for programme management.
• There is a relatively wide range of activities and it is not immediately clear which are most significant in improving learning.
• Much of the approach involves large-scale in-service teacher training, where international evidence suggests impact is mixed.
Data, information and routines
Very strong
• Strong focus on measuring learning outcomes for all students through effective data systems.
• New technology used to enhance MIS.
Understanding of delivery and problem solving
Strong
• Evidence from CCE and LEP RCTs used to inform programme design; commitment to understanding systemic issues.
• Evidence that measuring data alone doesn’t improve outcomes.
• Increasing local problem solving and engagement with schools under Saksham Haryana.
Communication, accountability and culture change
Medium
• Learning outcomes now prioritised, with a strong focus on measuring the learning of individual students on a regular basis.
• Strengthening of the inspection system, with more regular visits and a focus on learning.
• The initial approach to developing QIP was quite centralised, although analysis broadened over time.
• Initial difficulties in communicating with schools for programme management.
• Less focus on systemic change through reforming areas such as pre-service training.
• Government schools have improved results faster than private schools, but enrolment in private schools is still rising (unlike in Punjab).
There have been improvements in literacy and numeracy since the introduction of the Quality Improvement Programme (QIP) in 2014. India’s ASER report compares literacy and numeracy in Standard V government schools across all States. The latest ASER report shows that Haryana has seen improvements in literacy and numeracy between 2014 and 2018, but that these improvements have not been exceptional. ASER data therefore suggests that Haryana has not outperformed other States which did not adopt the delivery approach. Haryana did show a greater rate of improvement between 2012 and 2014 but, as QIP implementation commenced in early 2014, it is not clear to what extent this is due to QIP or to prior interventions.
According to the 2018 ASER report (published in January 2019), Haryana was ranked 4th out of 18 states in 2014 on Standard V reading and had fallen to 5th by 2018, ranking 11th of the 18 states in terms of percentage-point improvement (from 53.9% to 58.1% between 2014 and 2018). On Standard V division, Haryana was ranked 5th in 2014 and 3rd in 2018, and ranked 12th in terms of percentage-point improvement (from 30.8% to 34.4% over the same period).
The ASER Haryana-specific report for 2018 shows that there have been significant improvements in reading and arithmetic in government schools since 2014 for children in Standard 3 and 5 but that there has been a decline in performance for children in Standard 8.14 Whilst private schools continue to score higher than government schools, the Standard 3 and 5 performance of private schools has generally remained static since 2014, failing to mirror the increases achieved in government schools.
Arithmetic: Between 2014 and 2018 the proportion of Standard 3 children in government schools who could do subtraction increased from 24% to 31.6% whilst in private schools it declined from 74.7% to 70.7%.
The proportion of Standard 8 children in government schools who could do division declined from 50.7% to 49.1% between 2014 and 2018 whilst in private schools it declined from 86.1% to 76.8%.
Reading: Between 2014 and 2018 the proportion of Standard 3 children in government schools who could read a Standard 2 level text increased from 21.7% to 33.5% whilst in private schools it declined from 61.5% to 56.1%.
The proportion of Standard 8 children in government schools who could read a Standard 2 level text declined from 78.4% to 73.4% between 2014 and 2018 whilst in private schools it declined from 93.5% to 88.7%.
In addition to the improvements in literacy and numeracy achieved since the introduction of the Quality Improvement Programme (QIP) and set out on the previous slide, there have also been several other positive achievements in Haryana’s education sector.
• Innovative use of technology including the widespread use of tablets for conducting lessons.
• The introduction of Saksham Registers, through which teachers maintain a log of students’ learning levels, focusing on children who are significantly behind grade level. This enables them to monitor learning progress on a weekly basis and target lessons to tackle students’ specific needs. Teachers make use of Saksham Adhyapak, a dashboard based on competency-linked assessment developed for teachers to conduct student assessments six times a year.
• These regular assessments of students’ progress are then supported by quality assurance from the school inspection system. Every government school gets an inspection visit at least once every two months. Inspections are random and focus on academic monitoring during which students are selected at random to answer questions on an academic monitoring ‘pro forma’.
• Relationships between schools and parents have been strengthened through Saksham Parent-Teacher Meet (PTM) sessions which are organised for each Block of schools for parents of pupils in all classes.
• The Haryana state government believes that, as of 2019, over 65% of students are grade-level competent in Classes II, V and VII. 26 out of 81 blocks have been declared Saksham (capable), i.e. reaching 80% at grade level as assessed by Gray Matters (a third-party verification agency).
• Large scale events are held under the direction of the Chief Minister where Blocks are declared Saksham. The Chief Minister’s involvement is important in publicising achievements and reinforcing the importance of education.
Haryana was looking to build on the achievements of the Learning Enhancement Programme (LEP) developed by Pratham. LEP sets aside a segment of the day for classes aimed at students’ literacy and numeracy levels rather than their grade level. A randomised controlled trial, published in 2015, found that while there was no significant impact on maths test scores, LEP did have a large effect on students’ basic Hindi skills.15
• Whilst Haryana had made good progress on access, quality improvements had largely been driven by NGO programmes rather than a holistic approach led by the state government.
• Technical support to develop the Quality Improvement Programme was provided by Boston Consulting Group (BCG) from 2013 whilst the Michael & Susan Dell Foundation provided financial support for implementation which commenced in early 2014.
• The approach did not look to establish new structures but instead worked through the Directorate of School Education and took a holistic, integrated and results-oriented approach.
• The Quality Improvement Programme was then converted into the Saksham Haryana movement in 2017 (part of the national Saksham programme) and adopted its performance targets.
The approach did not look to establish new structures but instead worked through the Department of School Education which was responsible for overall implementation and coordination of inputs from others.16
• The Government of Haryana wanted to ensure that the Quality Improvement Programme (QIP) was driven by the core Department with responsibility for school education.
• A Program Management Unit (PMU) was established which reported to the Department of School Education. The PMU was staffed by advisors from the Government of Haryana and Boston Consulting Group.
• The specific activity workstreams under QIP were led by initiative teams which were headed by Government administrators- this helped to build government ownership whilst also ensuring that team members from different organisations or parts of Government could be brought together to share expertise.
• QIP also involved support from implementation partners in specific areas whilst funding for program management was provided by the Michael and Susan Dell Foundation.
The Chief Minister has a personal interest in ensuring that all school Blocks are declared ‘Saksham’ (meaning that 80% of pupils have been externally verified as achieving at grade level). This involves visible leadership and commitment in holding public celebration events when Blocks achieve Saksham status.
• The Chief Minister’s office oversees the Saksham programme directly at the Block level, meaning that it is clear to all involved that this is a high profile priority.
• Young professionals have been hired as Chief Minister’s Good Governance Associates (CMGGA) to coordinate activities in each district and report directly to the Chief Minister’s office.
• Despite this visible interest from the Chief Minister’s Office, responsibility for actually delivering the programme lies with the Department of School Education.
• Coordination and communication between the Chief Minister’s Office and Department of School Education is important to ensure that a ‘joined up’ approach is being taken to implementation and so that a unified front is presented to Districts and Schools.
• During the early years of the programme it was hampered by frequent changes of leadership in the State government but the current Chief Minister provides visible and committed leadership.
The QIP/Saksham Haryana approach relies on the production and analysis of regular data on students’ learning levels to drive performance improvements. The programme has a very explicit focus on tracking learning outcomes (rather than inputs) across 15,000 schools.
• Saksham Registers are used by teachers to maintain a log of students’ learning levels, focusing on children who are significantly behind grade level. This enables them to monitor learning progress on a weekly basis and target lessons to tackle students’ specific needs. Teachers make use of Saksham Adhyapak, a dashboard based on competency-linked assessment developed for teachers to conduct student assessments six times a year.
• These regular assessments of students’ progress are then supported by quality assurance from the school inspection system. Every government school gets an inspection visit at least once every two months. Inspections are random and focus on academic monitoring during which students are selected at random to answer questions on an academic monitoring ‘pro forma’.
• An integrated MIS replaced multiple individual IT systems and hosts data on 2.2 million students and 100,000 teachers. This helps to enhance accountability as well as generating administrative efficiencies.
• A Randomised Controlled Trial (RCT) was carried out by 3ie on CCE (Continuous and Comprehensive Evaluation, in which high-stakes exams are replaced with more frequent assessments of learning by teachers) and LEP (Learning Enhancement Programme, in which teachers set aside a segment of the day for classes aimed at students’ literacy and numeracy levels rather than their grade level), which were two precursor programmes to QIP. This RCT found that CCE (i.e. focusing on regularly measuring learning levels) had no impact on pupils’ learning outcomes, either in isolation or in combination with LEP. This suggests that there is not a strong evidential basis in Haryana that measuring learning outcomes improves performance- certainly not compared with student-level instruction, which did prove impactful. Of course, by collecting accurate data on learning outcomes it is then easier to determine students’ working levels.
The QIP had an explicit priority of improving learning outcomes across 15,000 schools. During the design of the programme BCG helped to ensure that QIP focused on four actions: i.) zeroing in on an ambitious goal and tracking progress; ii.) digging deep to unearth big systemic problems; iii.) imposing discipline to design scalable interventions and iv.) using new technology to implement innovative initiatives.
The QIP roadmap was built on the four principles set out above and was designed to have three core intervention areas.
1. Learning outcomes- develop clear focus and accountability for students’ results via learning assessments, school inspections and greater community engagement.
2. In-school interventions providing tools and training for classroom-based and in-school improvements. Provision of resources such as new textbooks and revised, effective training and mentoring for teachers, especially to cater to first-generation learners with multiple-grade learning deficits.
3. Systemic interventions to strengthen the education system including creating a management information system (MIS); organisation structure changes and capability-building; and provision of sufficient teachers through school consolidation or redistribution.
The main school-level activities to improve performance have taken the form of large-scale In-Service Training (INSET) which is a fairly standard intervention in educational improvement programmes worldwide. In Haryana this was linked to regular assessments of learning outcomes and the development of an MIS to enable this data to be used for performance management purposes.
Haryana benefited from external funding from the Michael & Susan Dell Foundation to help support the programme management of QIP. However the majority of programme interventions were designed to be operated with government funds to ensure sustainability.
The QIP roadmap developed in 2013 envisaged a holistic approach which would address a range of systemic issues to ensure the effectiveness of school-level initiatives to improve learning outcomes. The intention was that these interventions would lead to sustainable long-term transformation.
• The development of the QIP roadmap was initially a relatively centralised process although it was able to draw on evidence and lessons learnt from the CCE and LEP programmes (where comprehensive RCT data was available from 2014). It had an explicit intent to base initiatives on an understanding of the root causes of systemic issues.
• The systemic transformation elements focused mostly on MIS and improving use of data on learning outcomes- there was not a focus, for example, on initial teacher training.
• As the programme developed and transitioned into Saksham Haryana there was greater emphasis on school level autonomy to develop solutions to issues.
• Communications with 15,000 schools and teachers proved problematic at the start of the programme but the use of WhatsApp groups, SMSs and mobile-compatible online forms helped improve communications and enhance understanding of local delivery issues. 17
QIP and Saksham Haryana have undoubtedly ensured that there is a far stronger focus on learning outcomes than had previously been the case in the State. There is also a much greater emphasis on monitoring individual pupils’ learning levels on a regular basis and using this data to measure progress.
• The high-profile declaration of Blocks of schools which have achieved Saksham status has also raised the public profile of reform efforts and ensured that parents and communities have a greater understanding of efforts being made to improve learning. Despite this awareness and progress, enrolments in private schools continue to increase.
• Whilst learning outcomes are now clearly a system priority, it is less clear whether there have been fundamental and lasting changes in teacher accountability and performance management. Data exists through the integrated MIS to manage performance, but the achievement of improved learning outcomes is a fairly complex process which would be problematic to link too closely to performance management systems.
• Interactions between parents and teachers in government schools have improved through the conduct of Saksham parent-teacher meet (PTM) sessions, and this is a positive development.
• The development of ‘Saksham’ as a clearly defined status which all schools have to achieve has helped to provide a purpose for school improvement which is easily understood and externally verified.
• The inspection system now visits government schools at least once every two months and has an academic pro-forma to assess learning attainment.
• Haryana has based its interventions on the hypothesis that monitoring pupil learning on a regular basis will lead to performance improvements. Whilst this makes intuitive sense, the RCT conducted on CCE suggests that, in Haryana, this hypothesis may not hold. Activities have generally focused on large-scale in-service training, where global evidence suggests that the impact on learning outcomes is mixed. More fundamental changes to pre-service training (PRESET) have not taken place.
The main focus on QIP/Saksham Haryana is to improve learning outcomes in government schools and to strengthen the government education system. However, as a significant proportion of children in Haryana are enrolled in private schools, performance in these schools is also an important contributor to improved learning outcomes.
• The ASER Haryana 2018 data presented earlier in this report shows that, whilst private schools score higher than government schools on reading and arithmetic metrics, there has been a fairly significant improvement in government schools at Standard 3 and Standard 5 between 2014 and 2018, whilst there has been a slight decline in performance in private schools over the same period.
• This data may suggest that QIP/Saksham Haryana has succeeded in making improvements to learning outcomes in the public school system which have not been replicated in the private school system.
• Interestingly, despite these improvements in learning outcomes in government schools, private enrolment has continued to rise as a proportion of children in school. This data is presented in the chart opposite and shows consistent rises in enrolment across all Standards and years apart from Standard 2 where there was a slight reduction in enrolment between 2016 and 2018 (although the 2018 figure was still higher than that for 2014).
Ethiopia’s delivery unit is very recent and has struggled to make an impact to date after the Minister who supported its establishment left office.
• The delivery process began in late 2017 so is still relatively new and is currently struggling to find its place within what is quite a complex education delivery system.
• Political commitment came from the former Minister, who left office in late 2017. System leadership has weakened since he left, and external donor interest is sustaining the delivery units.
• Too many, and overly complex, priorities have presented barriers to achieving quick results.
• The delivery unit has issues with leadership, resourcing and accountability given the context of Ethiopia’s Federal structure.
• The approach is recognised by regional officials as having potential to drive change, but it is not yet realising this potential.
• The recent provision of illuminating data may be a catalyst to re-awaken the approach- this includes new data showing that learning outcomes are even lower than assumed.
System Leadership and commitment
Initially strong then faded
• The Minister was initially a strong supporter, but after the change of Minister some momentum was lost.
• No recognition of regional leadership (Regional Presidents) in an autonomous Federal structure.
Prioritisation and resourcing
Unclear priorities
• Budgets available for everything except school feeding.
• Technical assistance beginning to build momentum.
• Priorities vague and ambitious, and tough to achieve.
• The RDU is staffed by one person who already has another job.
Data, information and routines
Started then stopped
• Initial energy and enthusiasm.
• Energy quickly faded due to lack of interest.
• Difficulties in gathering data given multiple priorities.
Understanding of delivery and problem solving
Centralised planning
• Plans with clear milestones have been produced.
• Limited local input or ownership in plans.
• Not enough contextualisation for very different regions.
Communication, accountability and culture change
Weak
• Communication of goals/targets has been relatively effective.
• Limited engagement in system strengthening
• Promises made to MDU/RDU staff and not kept have led to resignations.
The initial learning data coming from schools does not really reflect the impact of the delivery approach, given how little has been done to date. The data generated has, however, provoked renewed interest in the potential of the delivery approach to achieve change.
• It is too soon to assess the impact of delivery on learning, but those involved with it say that due to low activity it can’t have achieved much so far.
• Monthly learning assessments from schools have been generated from the data improvements the delivery team has been working on. This is seen by the team as a baseline more than evidence of delivery impact.
• The data shows that learning outcomes are a long way below the targets set in the plans. This data has re-engaged the Minister and he is now looking to delivery as a data-generating process to inform Ministry action.
“We are collecting data against delivery priorities to track progress. The results are very poor. The progress against priorities is very limited but this is unsurprising given lack of any implementation. In fact I would say any gains can’t be attributed to delivery at all. This data is a useful baseline though.”
MDU Delivery Advisor
“REB heads and the Ministry of Education were very excited about the learning assessment results. They are expecting to take action and are more motivated about delivery now. The data played a key role in helping leadership shift their mind. They knew the problem, but didn’t know the extent of the problem. They were shocked.”
MDU Data Analyst
• The former Minister of Education (Dr Shiferaw) was impressed with what he had read and heard about delivery approaches elsewhere and wanted to bring them in to support education reform in Ethiopia.
• Dr Shiferaw had a willing partner and backer in DFID, which through its QESSP project supported the set-up of a delivery approach with technical assistance and the provision of staff and resources to the Federal ministry.
• Work on delivery started in mid-2017, and Dr Shiferaw was replaced (becoming Ambassador to South Africa) in late 2017.
• In addition to the change of Minister, the Prime Minister also changed in 2018.
Ethiopia
Deliverology
Ethiopia has two levels of delivery units to fit within the Federal structure: one working within the Federal Ministry of Education and a second attached to Regional Education Bureaus.
• Due to the Federal system in Ethiopia there are delivery units at Federal and Regional level.
• The MDU sits within the FMoE and is staffed by a mixture of seconded Ministry officials and consultants financed by DFID.
• The role of the MDU is to oversee implementation of plans in priority areas, collect reports on progress, follow up on problem areas and report to the Minister.
• The Regional Delivery Units each essentially consist of a single person who has taken on the role of ‘regional delivery focal person’ in addition to their existing job.
• The regional delivery focal person is meant to support delivery plan implementation coordination across the REBs, collect data on progress and report this to the MDU.
Deliverology in Ethiopia began with strong system leadership, but political change and Ethiopia’s complex Federal structure have quickly eroded that initial momentum.
• Under Dr Shiferaw as Federal Minister of Education there was strong leadership for the delivery approach from the very top. This led to high initial momentum, energy and commitment to the approach.
• After the change in Minister, attention was placed elsewhere and senior leadership interest and commitment began to wane.
• The change in Minister came very early in the delivery process and so practices and routines were not well established which meant they easily fell away.
• On top of this, the delivery approach had been a Federal initiative. It had involved Regional Education Bureaus, but not Regional Presidents, and therefore when interest from the Federal Minister waned it was easy for Regional Education Bureaus to also look elsewhere, as there was no encouragement from their Regional Presidents to continue to embrace delivery.
• Leadership on delivery is currently being provided by DFID and they are committed to continued engagement through the next phase of their technical assistance support.
• The current Minister is more concerned with implementing the current Education Roadmap than with the delivery approach. Recent conversations with the MDU about the new data on learning outcomes coming from the delivery team have piqued his interest in delivery again.
“The State Minister for general education has been very keen on delivery – he understands the need for these principles. But he doesn’t have the power to deal with key things and the REB heads can ignore him more easily.”
MDU Delivery Advisor

“The culture here is that everyone looks to who is above them. REB Heads are looking at the Minister. The Minister is now not talking about delivery at all and so nor are they. He stopped focusing on it when the Delivery Associates team left. The former minister was pushing delivery heavily but he moved on soon after we started.”
MDU Delivery Advisor

Priorities were developed during a series of Delivery Labs that focused on different regions and sections of the education system.
• Labs were initially led by Delivery Associates consultants and later by British Council consultants embedded within the MDU; they were not led by FMoE employees.
• A vast number of hypotheses were tested, which contributed to some vague priorities.
• Significant variations in the situation and needs in different regions led to an approach where 6 Federal priorities for all were chosen, but an additional 3 priorities were selected by each region.
• Nine priorities is a large number on top of everyday activities for Regional Education Bureaus.
• Some of the priorities chosen are very complex and ambitious and without adequate resources and time seem too hard to achieve.
• It may have been more effective to identify some areas where problems were less technically challenging and required a focus on improving delivery systems as a way of embedding the process and illustrating the potential for success.
“There are so many problems. In the labs the participants wanted to solve all the problems. They were not used to a prioritisation process and it was difficult to do so quickly”
MDU Delivery Advisor

“Some strategies are national and some are regional. The idea is activities are not new, just a change to the way we operate. We just implement in a different fashion.”

RDU Focal Point

6 Federal strategies:
• Improving the use of existing lesson plans in classrooms
• Building capacity of teachers through active and needs-based coaching
• Improving instructional leadership in schools
• Improving parent engagement in the learning process of students
• Improving data availability on literacy and numeracy assessments, implementation progress and impact
• Driving accountability and performance through better use of data

3 additional regional strategies selected by each region. Examples include:
1. Improving student learning with consistent attendance and basic nourishment through the school feeding program
2. Ensuring all teacher staffing gaps are filled through sustainable sourcing strategies
3. Ensuring all primary schools have the basic minimum facilities required
Hypotheses tested

Students have access to schools
• Adequate number of schools exist within reach:
– Adequate number of pre- and primary schools exist (i.e. KG, O-class, and G1-8)
– Schools are easily accessible and within reach of students (pre- and primary both)
• Kids are enrolled in schools and they attend:
– Kids have been enrolled in schools
– Students attend school regularly and are ready to learn

Learning content is effective
• Curriculum is simple and allows sequential learning at a decent pace
• Textbooks are linked to curriculum and effective in explaining learning objectives
• Textbooks are available and being utilised
• Teacher guides are effective at explaining learning objectives
• Teacher guides are available and being used by teachers

Schools provide quality instruction
• Adequate numbers of teachers exist
• Teachers are qualified and effective in delivering instruction:
– Teachers attend school regularly
– Teachers have necessary qualifications and skills
– Teacher training on content and pedagogy is effective
– Teachers are motivated and apply what they learn

Assessments are periodic and are effective in testing student learning
• Internal exams / tests are conducted frequently, without cheating
• Exams / tests are linked to curriculum and test application, not rote learning
• Exam results are used to tailor instruction and provide remedial support regularly
• External exams are conducted to assess and benchmark students across schools

Students have a conducive environment in schools / home to learn well
• Basic facilities exist and are maintained:
– Classrooms are adequate in size and there is no overcrowding or multi-grade teaching
– Water, electricity, sanitary facilities and furniture are available
• Parents / wider community are active participants:
– Parents value the importance of education
– Parents provide support at home
– Parents ensure children are nourished and healthy to attend school and learn well

Leadership and accountability in schools exists
• School / Zone / Woreda level leadership is effective:
– School leadership exists / attends schools
– School leadership is effective (e.g. they give feedback and performance manage teachers)
– Zone / woreda leadership visits schools and actively coaches / manages school leaders

High quality data is collected and progress against KPIs is tracked to drive accountability
• Data on key metrics is collected
• Data is reliable
• Data is collected frequently
• Data is used to track progress against KPIs, identify root causes and their solutions
Staffing and resourcing indicate that delivery units and their priorities were not sufficiently supported within the system.

Federal (MDU)
• Ministry staffing: FMoE education staff were recruited but retained their old jobs and never fully transitioned to focus on the MDU. Staff were promised career recognition and pathways for their work in the MDU, but this didn’t happen and so some simply returned to their previous positions.
• External staffing: External consultants from Delivery Associates (for 9 months) and British Council delivery and data consultants embedded within the MDU.

“The people appointed to the MDU were from within the Ministry and had other jobs to do as well. They were promised additional benefits and that the MDU work would link into the Ministry career structure but this didn’t happen. Many have returned to their old positions only and don’t work with the MDU any more.”

MDU Member

Regional (RDU)
• Ministry staffing: One member of staff was named ‘regional delivery focal person’ but was also expected to continue doing their existing job.
• External staffing: No external support, meaning implementation was reliant on a small existing team which in some regions is very weak. Very limited capacity building has been provided to date.
• Alignment of budget and delivery priorities: Detailed plans but no alignment of Regional budgets with delivery plans, though sufficient budget exists for all plans except school feeding. With no additional budget it was easy to deprioritise RDU activities.

“The RDU had no formal position and no connections into the formal REB/FMoE structures. It had no additional budget and therefore had no activities of its own beyond reporting.”

“Leaders in some regions are very weak. There are more capable people around, but due to cultural/regional politics the leaders have to be from the region, not the most capable person.”

RDU Member

Data collected on implementation progress
The routines established were: reports from the Minister to the Prime Minister every two weeks; RDU progress reports (traffic lights) every two weeks; monthly stocktakes at national level; and follow-up from the MDU with RDUs. In practice:

• Regional reports on progress: Every two weeks RDUs had to update their traffic-light report on progress and send it to the MDU. But the Regional Delivery Units are so demoralised that the last traffic-light data from them was sent three months ago.
• Monthly reports from MDU to Minister: This report went to the Minister, State Minister and DFID. It happened until a few months ago, but stopped without any consequences.
• National stocktakes: In December and January 2018, after the labs, there were a few stocktakes. These stopped because there was no real leadership commitment to them. In addition, when they did happen and follow-up activities were identified for regions falling behind, these were often ignored. This seems to be due to the autonomy of Regions and the fact that Regional Presidents were not involved, so Regional Bureau Chiefs felt they could ignore the Federal Ministry.
• Reports to Prime Minister: The Prime Minister initially wanted updates every two weeks and wanted to know which regions were underperforming, but with a change in political leadership this interest stopped. The current Prime Minister is focused on the Roadmap.
Independent school-level digital data collection on delivery-focused KPIs.
The MDU receives the data directly, cleans and analyses it, and presents a data pack to various levels.
Data from 4,000 schools and 70,000 pupils
• Learning assessments in English, Mother Tongue and Maths
• Time on task
• Use of lesson plans
• Instructional leadership
• Pupil engagement
Data is collected monthly at the moment, but the required frequency for each type of data will be reviewed.
50% is collected by independent data collectors and 50% by Cluster Supervisors.
“The plan is to collect 3 months data at this point. We needed to test the software, survey design and instruments. We have showed we can collect data and we are now reviewing what we should collect on different basis – this is part of the process.
Independent data collection cannot be the solution long-term. There is not a major difference between data collectors and cluster supervisors so cluster supervisors should do it. Maintaining the data system is quite important to change the system functioning.”
Poor existing data systems meant there was no clear way of regularly monitoring performance indicators and the data section of reports would be empty. Recent data improvements have generated some interest and renewed energy.
MDU Data Analyst
Delivery labs considered a huge number of issues, but the understanding of delivery systems and context-specific problem solving in relation to priority areas appeared to be limited.
• Some of the priorities selected are vague and complex. This seems to stem from the challenges of trying to focus on a limited number of priorities whilst not having the discipline to prioritise.
• This led to a lack of clarity on the priorities from the beginning, and a planning process which appears to have been heavily led by an external technical assistance team which was only in place for a short period.
• This approach provided structured plans, but ones which did not understand local delivery systems in depth and so did not adequately solve problems across a variety of contexts.
• Regions were left to implement generic delivery plans based on a centralised model, rather than on an understanding of the specific regional contexts which vary significantly in Ethiopia.
“There were challenges at all levels.
In the REB there was an attitudinal challenge. They see deliverology as additional work. They didn’t approve strategies. They need time, planning and budget.
There was also resistance at school level. The use of new lesson plans is harder work than traditional way.”
RDU Focal Point
Due to the rapid nature of the delivery approach and the desire to produce action plans, front line workers seem to have been recipients of generic plans rather than contributors to developing contextualised solutions.
• Regional stakeholders were heavily involved in Delivery Labs but then seem to have been presented with generic implementation plans, despite very different contexts.
• There does not seem to have been any significant engagement of front-line workers in developing solutions.
• Lack of ongoing support for Regional Education Bureaus meant generic plans were used leading to a lack of ownership and contextual appropriateness.
• Rather than delivery being used to identify and unblock delivery obstacles, delivery is being used to implement centralised implementation plans with little local ownership or buy-in.
“There was an assumption that the Regional Education Bureau (and REB head) would support the work of the RDU and implementation of the plan. This was a false assumption because the RDU had no formal position in the REB, it had no connections into the formal REB/FMoE structures and it had no additional budget. It therefore had no activities of its own beyond reporting.”
MDU Delivery Advisor

The approach taken thus far appears to present a paradox: the aim appears to be connected to cultural change, as the priorities include big, complex challenges, but the approach has been resourced as a short-term linear intervention.
• The approach has been to try to establish delivery units largely operated by existing staff and therefore prompt cultural change from within the system.
• The priority areas are vague but hugely ambitious and challenging and indicate a system reform/cultural change aim.
• However, this ambition is not yet matched with the level or duration of support for regions and so fast-track cultural change has been attempted with limited contextual appreciation, local level involvement or capacity building.
“Real leadership and ownership key, but also there needed to be more of an appreciation of the change management required – a better understanding of existing context and culture.
The thing that works most like delivery in Ethiopia is funerals – they are very focused, tasks are prioritised and they are efficient! We could have used that example. There needed to be a slower approach, building piece by piece. Not cutting and pasting from elsewhere. At the beginning there should have been a focus on a few key principles. We needed to work with existing and established systems. Change happens quickly in Ethiopia and so previously fashionable initiatives are easy to ignore.”
MDU Delivery Advisor

Deliverology in Ethiopia initially had senior leadership backing and engagement, but a change of Minister and a lack of engagement at regional level have led to minimal impact on accountability structures in Ethiopia.
• The delivery approach in Ethiopia was promoted by the previous Minister for Education. He brought energy and ownership and there was an opportunity to focus on improved accountability for results. In the beginning, meetings were called to discuss which regions were doing well and which ones weren’t.
• When he left early in the process deliverology was left without a senior internal champion. The current Minister had other priorities and the delivery unit became like “a lost department” according to one person involved.
• The way the regional delivery units were set up also seems to have limited the impact on accountability. The Federal level does not have total control within the regions, and the Regional President (to whom the head of the regional education bureau reports) has a lot of influence. Regional Presidents were not involved, and so REBs did not take action and delivery had very little influence on accountability for performance.
• Many of those appointed to RDUs have left their posts and gone back to previous jobs.
“The new Minister was initially interested but then this began to decline. When the external consultants [from Delivery Associates] left it really dropped off the radar. The Minister is not talking about delivery now and so neither are the REB heads”
“We used to send a monthly report to the Minister, State Minister and DFID on updates. I wasn’t getting any response anymore, so I stopped – and nobody said anything or asked me why I wasn’t doing it anymore.”
“We used to report every week, now it is declining. There is no longer a demand from above.”
MDU Member

MDU Member

The deliverology approach in Ethiopia did not set out to work with private schools in any way. Government funding of education in Ethiopia is much higher than in many neighbouring countries.
• The approach has focused on supporting the public education system.
• Given the use of core priorities across all regions and the limited number of private schools in many regions, this has not been an area of focus for the Deliverology approach in Ethiopia.
• Private schools are predominantly located in Addis Ababa and other large urban centres.
• Ethiopia’s spend on education as a proportion of government expenditure is well above SSA averages – showing there is a commitment to support the public system with public finance.
• Despite this, donor contributions still account for 10-20% of sector spend in recent years.
• Prime Minister’s Delivery Unit (PMDU) established by Tony Blair in 2001. Led by Sir Michael Barber (2001-5), Ian Watmore (2005-7) and Ray Shostak (2007-10).
• Relatively small unit (never more than 40 staff) reporting directly to the PM. Focused on a small number of key outcomes which were a real priority for the PM and his Government, including education and skills.
• Worked across Central Government to establish joint performance measurement and accountability structures, shifting focus onto delivery of results as well as formulation of policy.
• Conducted short (eight week) and confidential problem solving reviews with government departments to understand and address key delivery challenges.
• Moved beyond simple target setting and evolved greater focus on understanding delivery systems and the levers (hard and soft) used to deliver outcomes.
• Achieved some notable successes in education and skills (e.g. improving the performance of London schools, expanding apprenticeship opportunities, etc.).
• Abolished in 2010 following the election of a new Coalition Government. Re-established in 2012 as the Implementation Unit.
• A structured and analytical approach that enables teams to both understand a system and what is blocking achievement and, based on evidence, identify solutions that unblock what is getting in the way.
• A powerful way of both building common cause for change and bringing decision makers information, analysis and recommendation for improvement efforts.
• Done in partnership with key Ministry, Department or Agency.
• Narrow in scope: solving a particular problem.
• Cost neutral: focused on improving what is there.
• Based on analysis of the current situation, with a focus on variance (learn from the best and promote it; eliminate the worst).
• Always test hypotheses with frontline fieldwork.
• In 2003 London had the worst primary and secondary results in the whole of England.
• PMDU asked to work with Department for Education & Skills (DfES) in 2002 to improve performance of London secondary schools.
• This led to the introduction of the ‘London Challenge’.
• There were 3 main objectives:
i.) to raise standards in the poorest-performing schools;
ii.) to narrow the attainment gap amongst London pupils;
iii.) to create more good and outstanding schools.
3 main activity strands:
i.) identifying the worst performing schools and assigning an adviser to each school to help develop tailored solutions.
ii.) identifying the local authorities of most concern and assisting them to develop a transformational change ‘vision for school improvement’ and leverage additional resources;
iii.) investing in leadership and teaching across London including appointing ‘consultant heads’, introducing flexible payscales and pioneering Teach First.
By 2010 Ofsted rated 30% of London schools as ‘outstanding’ compared to 17.5% nationally, and very few London secondaries fell into the bottom Ofsted categories.
The improvement also included the poorest pupils, and by 2010 London had the smallest gap between the performance of children in receipt of free school meals and the average.
1. Understand and utilise the existing assets within the system (rather than having a deficit focus).
2. Maintain prioritisation and focus.
3. Importance of ‘authorising environment’ for rapid decision-making.
4. Invest in creating shared purpose and strong relationships.
5. Prominent role of committed ‘system leaders’.
• Performance Management & Delivery Unit (PEMANDU) established in 2009, responsible for overseeing the National Transformation Programme (NTP), a set of high-level strategic priorities broken down into concrete activities and actions. A large unit with over 130 staff.
• NTP implemented by Ministries, Departments and Agencies with PEMANDU monitoring progress and assisting in overcoming bottlenecks to progress.
• Malaysia’s Public Sector ‘culture’ conducive to performance management and results-focused approaches.
• Led by high-profile appointee from private sector (Idris Jala) who introduced ‘Delivery Labs’ as a means of bringing key stakeholders together to work intensively on detailed practical solutions to delivery issues.
• Held Open Days to publicise plans and invite critical comments from members of the public, the initial round of Open Days involved more than 20,000 participants.
• Introduction of ‘3 Feet Plans’ which are extremely detailed delivery plans monitored by PEMANDU on a weekly basis.
• PEMANDU was closed by the Government of Malaysia in 2018 and became a private consulting company, PEMANDU Associates.
Some of the key outcomes achieved through Malaysia’s application of the Delivery Approach in education included:
i. Dramatic success in expanding access to pre-primary education (see text box below).
ii. Improvements in important aspects of school performance at primary and secondary levels.
iii. Increased public understanding of education priorities and strategies for their achievement.
• Clear methodology- “Eight Steps of Transformation” (Strategic Direction, Labs, Open Days, Roadmaps, KPI targets, Implementation, Audit and KPI Target and Validation Process, Annual Reports).
• Focus on a limited number of well-defined priorities. Labs played an important role in creating ‘ownership’ of these priorities.
• Rigorous monitoring of KPIs through weekly reporting which is summarised in a ‘Minister’s Scorecard’.
• PEMANDU attracted top talent including those from the private sector.
• Strong focus on communications and stakeholder engagement.
• Heavy reporting burden on Ministries, Departments and Agencies with a complex institutional ‘ecosystem’.
• Suspicion within some parts of Government that PEMANDU are ‘outsiders’ pushing a private sector style performance regime.
• The quality of data for some KPI Indicators constrains the effectiveness of the performance management regime.
• Focus on specific targets means there is a danger of perverse incentives, gaming and ‘hitting the target but missing the point’.
• Designing the transformation programme through Labs meant that, in some cases, there was a lack of cohesion and coherence with wider Government strategies and plans.
• Directly replicating PEMANDU’s structure and approach will have big cost implications: it is comfortably the largest Delivery Unit style structure in the world, and its approach of using Labs and creating a complex institutional architecture carries a large cost burden. However, this approach delivered benefits in Malaysia.
• High level political leadership is an important success factor.
• Granular performance management and monitoring regimes can be very effective. Focus on the quality and integrity of data.
• An effective public communications strategy is a shrewd investment of resources.
• PEMANDU has worked with governments in Tanzania, South Africa, India (Andhra Pradesh) and St Lucia. These engagements have had mixed success, indicating there is a need for flexibility when dealing with different operational cultures and contexts. As an example, the granularity of 3 Feet Plans may be appropriate in Malaysia, but in Tanzania, where education service delivery is decentralised, the prescriptive nature of these plans meant that they rapidly became less relevant for holding civil servants accountable for progress.
• UKP4 was established in response to election campaign pledges made by President Yudhoyono to develop infrastructure, strengthen education, and increase business investment. Contextually this was a challenge because these pledges had to be delivered through a coalition government and the President’s policy office was understaffed.
• The Unit was modelled on the UK’s PMDU. President Yudhoyono had wanted to create it during his first term of office, but it was killed by the Indonesian legislature as it had powerful opponents within government who saw it as a threat; these opponents were less prominent during the President’s second term, enabling it to be re-created.
• The main purpose of UKP4 was to help ministries prioritise and achieve commitments by developing monitoring plans and tracking results- reporting on these quarterly rather than annually as had been the norm. It was headed by Kuntoro Mangkusubroto who was widely respected for leading tsunami reconstruction work in Aceh province.
• Kuntoro insisted on personally recruiting all UKP4 staff (with a staff strength of 16 initially, rising to 35 by 2012), paying them higher salaries than the government norm and making them sign integrity agreements. He also maintained the power to fire them for non-performance and deliberately selected people from outside the government bureaucracy. However, he could not persuade the President to abolish coordinating ministries, meaning that there was an additional layer of oversight bureaucracy which could delay progress.
• UKP4 had a broad scope from the very start, producing 129 ‘action trackers’, one for each target project during the first 100 days of the administration. Ministers then had to sign contracts linked to these action trackers. By 2010 UKP4 was monitoring 369 action plans, a figure which had risen to 415 by 2012. This was too many plans for the small staff of UKP4 to track effectively, so they ended up monitoring only about 20% of the plans quarterly.
• In 2011 UKP4 introduced a cell phone based complaints handling system to receive feedback from members of the public. This system proved imperfect as valid reports were mixed in with hearsay and rumours. Handling this system was just one example of how UKP4’s responsibilities broadened over its initial years of operation. Other examples included heading a task force on climate change and advising the UN on the 2015 successors to the Millennium Development Goals.
• Controversy amongst coalition members about progress ratings for various ministries meant that the President insisted that updates and results be shared with him alone (rather than at cabinet meetings as had been the case) so he could best decide how to handle issues.
• The head of UKP4 stated that he found that it became more difficult to assist the President and follow-up on progress when UKP4 was not aware of the details of the performance conversation and agreement between the President and each Minister following the shift to this confidential approach.
• UKP4 also acknowledged that narrowing the focus and having a smaller number of priorities to monitor would have made them more effective but that, given Indonesia’s political situation and heterogenous culture, this would have been very difficult.
• The DU model adopted by Colombia in 2014 was a fusion of the UK’s PMDU, President Clinton’s office in the US and specific recommendations made to Colombia by the OECD’s Public Governance Committee.
• Attached to the Presidency, the DU carried out 5 main functions:
i.) focused government attention on Presidential priorities;
ii.) coordinated government action to achieve these priorities;
iii.) supported rapid implementation and processing of related data;
iv.) provided technical support to understand issues and develop solutions;
v.) carried out monitoring of priority programmes.
• The DU built upon the National Management and Results Evaluation System (NMRES) which was the product of a decade of the Centre of Government’s work to improve accountability to citizens.
• The DU used the existing NMRES coordinating system to carry out their functions. A concept of monitoring performance using a president’s score card was introduced to ensure results were delivered (linked to the Government Plan).
• The DU was attached to the Presidency and led by the General Secretary to the President which ensured political commitment while the President was in office and committed to the process.
• A good system of monitoring, coordination and projection led to improved and accurate projections. The DU developed forecasting methodologies with tools for monitoring partial results to improve predictions in order to carry out corrections. This enabled government to anticipate and plan in advance because the model’s predictive power proved to be over 90 percent accurate.
• An example of good use of data is provided in the diagram opposite, where the risk of delivery failure for a specific goal is assessed across five separate domains (feasibility of goal, resources, planning capacity, leadership and strategic clarity). This enabled the DU to identify potential areas of failure before they arose in order to take mitigating action.
• The DU monitoring system worked with a subset of 170 indicators which were refined into mega-goals (prioritised sector goals generated from the analysed scorecard); forecasting methodologies (tools for monitoring partial results to improve predictions in order to carry out corrections); and definitions of specific plans (developing a chronogram and workplan).
• The Government of Ghana has developed an ambitious and comprehensive package of education reforms following the election of the NPP government in late 2016.
• Reforms focus on making extensive changes to almost all aspects of the education system including: introduction of a new pre-tertiary curriculum; introduction of fee-free secondary education; complete redesign of initial teacher training with the introduction of a practically focused B.Ed degree for all prospective teachers; teacher workforce reforms including changes to pay and performance management and the introduction of Teacher Licensing; introduction of a new school inspection system; TVET reforms, etc.
• Mixed progress was made on these reforms over the first two years, with some progressing well and others stalling. The Ministry of Education established a Reform Secretariat in December 2018 to oversee application of the delivery approach to reform implementation.
Diagram of Ghana’s Education Reform Structures

• The Reform Secretariat has an initial staff strength of 8, drawn from outside the civil service. This team is working alongside a team of MoE counterparts. The first task for the Secretariat (which is being supported financially by a ‘Ghana Beyond Aid’ grant from DFID) was to develop simple Roadmaps and accompanying Key Performance Indicators (KPIs) for each of the 12 priority reforms. The Minister of Education then signed performance contracts with all reform owners in March 2018.
• In its initial stages the Reform Secretariat is focusing on instilling project and programme management rigour for key reforms, paying special attention to those which are highest risk, such as the introduction of the new Pre-Tertiary Education Curriculum. Whilst some reforms are relatively easy to measure as they are concerned with national actions (such as changes of legislation), others involve 39,000 basic schools nationwide. Whilst these latter reforms may be more impactful, they present challenges when it comes to the availability of regular performance data. The Secretariat will try to address this by working through existing systems wherever possible.
• The number of reform areas (12) is large and presents a challenge for ensuring that progress is driven effectively across all areas. Some reforms are owned by institutions which have proved reluctant and resistant to change.
• The Government of Ghana used the delivery approach successfully in 2018 through the Teacher Education Reform Roadmap, produced and implemented with technical assistance from the DFID-funded T-TEL programme. The Minister was impressed with the achievements of this programme and wishes to replicate the approach in other reform areas, although the lack of dedicated funding for some priority initiatives is liable to be a constraint.
• The Secretariat is very new but has already worked with Ministry of Education counterparts to conduct a ‘deep dive’ investigation into learning outcomes and accountability at district and school levels. This analysis, published in January 2019, is being used to inform development of the World Bank-funded ‘Ghana Accountability and Learning Outcomes Programme’ (GALOP).