Issue and Revision Record
Document reference:
Information class: Standard
This document is issued for the party which commissioned it and for specific purposes connected with the above-captioned project only. It should not be relied upon by any other party or used for any other purpose.
We accept no responsibility for the consequences of this document being relied upon by any other party, or being used for any other purpose, or containing any error or omission which is due to an error or omission in data supplied to us by other parties.
This document contains confidential information and proprietary intellectual property. It should not be shown to other parties without consent from us and from the party which commissioned it.
Acknowledgments
This evaluation was carried out by Mott MacDonald Ltd on behalf of the Bill and Melinda Gates Foundation.
This synthesis report was written by Rachel Phillipson, Dr Janet Gruber, Dr Terri Collins, Dr Shawo Mwakilama, Dr Marjan Kloosterboer, Chelsea Stefanska.
Special Studies on which this synthesis also draws were authored by Dr Terri Collins, Dr Shawo Mwakilama, Dr Janet Gruber.
Additional contributions from: Professor Maureen Chirwa, Steven Wanyee, Joseph Dal Molin.
Thanks go to all the health workers and administrators in the districts and the Ministry of Health who gave their time to be interviewed and to complete questionnaires, providing the material for this evaluation. Thanks also go to Mott MacDonald and Cooper/Smith staff who have supported the fieldwork and the process of quality assuring, fact-checking and producing this and previous reports, including the special studies.
Special thanks to Bright Sibale and his team at the Centre for Development Management in Lilongwe who have conducted three rounds of the KAP survey and organized the conduct of our fieldwork so efficiently over the past 5 years.
Abbreviations
AIDS Acquired Immunodeficiency Syndrome
ANC Antenatal care
API Application Programming Interface
ART Antiretroviral therapy
BHT Baobab Health Trust
BL Baseline
BMGF Bill and Melinda Gates Foundation
CAC Community AIDS Committee
CBO Community based organisation
CDC Centers for Disease Control and Prevention [USA]
CDM Centre for Development Management
CHAI Clinton Health Access Initiative
CHAM Christian Health Association of Malawi
CMED Central Monitoring and Evaluation Division
CRVS Civil Registration and Vital Statistics
CS Cooper / Smith
DC District Commissioner
DDB District Data Bank
DDC District Development Committee
DDE Demographics Data Exchange
DDP District Development Plan
DHA Department of HIV and AIDS
DHAMIS Department of HIV and AIDS Management Information System
DHD Digital Health Division
DEHO District Environmental Health Officer
DHIS2 District Health Information System 2
DHMO District Health Management Office
DHMT District Health Management Team
DHO District Health Officer
DIP District Implementation Plan
DNHA Department of Nutrition, HIV and AIDS
DPPD Directorate of Planning and Policy Development
DQA Data quality audit
DUP Data Use Partnership
EGPAF Elizabeth Glaser Paediatric AIDS Foundation
EHR Electronic Health Record
EMR Electronic Medical Record
FBO Faith based organisation
FGD Focus Group Discussion
GAP Global Action Plan for Healthy Lives and Well-being for All
GDHI Global Digital Health Index
GFATM Global Fund to Fight AIDS, Tuberculosis and Malaria
GoM Government of Malawi
HAC Hospital Advisory Committee/Health Facility Advisory Committee
HCT HIV counselling and testing
HDC Health Data Collective
HIE Health Information Exchange
HIV Human Immunodeficiency Virus
HMIS Health management information system
HRH Human resources for health
HSA Health surveillance assistant
HTC HIV testing and counselling
HTS HIV testing services
ICT Information Communication Technology
ID Identification
IDRS Integrated Disease Reporting System
IDSR Integrated Disease Surveillance and Response
IFMIS Integrated Financial Management Information System
IHRIS Integrated Human Resource Information System
IP Implementing Partner
IT Information Technology
I-TECH International Training and Education Center for Health
ITU Information Technology Unit
KAP Knowledge Attitudes and Practice
KII Key Informant Interview
LAHARS Local Authority HIV and AIDS Reporting System
LHT Lighthouse Trust
LIMS Laboratory Information Management System
LIN Luke International, Norway
LMH Last Mile Health
LMIC Low- and Middle-Income Country
LMIS Logistics Management Information System
LTFU Loss to follow up
M&E Monitoring and evaluation
MACRA Malawi Communications Regulatory Authority
MM Mott MacDonald
MoH Ministry of Health
MoLG Ministry of Local Government
NAC National AIDS Commission
NGO Non-Governmental Organisation
NRB National Registration Bureau
NRIS National Registration and Identification System
NTBCP National TB Control Program
OHSP One Health Surveillance Platform
ONSE Organized Network of Services for Everyone’s Health
OPC Office of the President and Cabinet
OPD Outpatients Department
PDD Principles for Digital Development
PEPFAR President’s Emergency Plan for AIDS Relief [USA]
PHIM Public Health Institute of Malawi
PIA Privacy Impact Assessments
PIU Program Implementation Unit
PMTCT Prevention of mother to child transmission
POC Point of Care
PTF Pandemic Task Force
QMO Quality Management Office
RAID Redundant Array of Independent Disks [data storage technology that combines multiple disks to protect against permanent loss of data]
SEP Socio-economic profile
SOP Standard Operating Procedure
SS Special study
TA Technical Assistance
TNM Telekom Networks Malawi
TWG Technical Working Group
UBR Universal Beneficiary Register
UHC Universal Health Coverage
UIC Unique Identification Code
UNAIDS Joint United Nations Programme on HIV/AIDS
UNDP United Nations Development Programme
UNICEF United Nations Children’s Fund
USA United States of America
USAID United States Agency for International Development
VF Verification Factor
VMMC Voluntary medical male circumcision
WHO World Health Organization
Executive summary
Background
The Kuunika: Data for Action Project, 2016-2021, is funded by the Bill & Melinda Gates Foundation (BMGF). The goal of the Kuunika Project is to improve the planning, performance and quality of health services in Malawi through better digital information.
In 2016, BMGF appointed Mott MacDonald (MM) as the independent evaluator for the Kuunika Project, with the aim of: (i) generating insights for the better implementation of the project and (ii) eliciting general lessons on how best to introduce new information technology into health information systems in low-income settings.
This Final Synthesis Report presents the overall findings of the independent, theory-based evaluation. It builds on the findings of the independent baseline evaluation (2017) and a midline evaluation (2019) to incorporate the findings of the endline evaluation and three in-depth ‘special studies’.
The methodology for the endline evaluation included a data quality assurance (DQA) review, an endline knowledge, attitudes and practices (KAP) survey, key informant interviews (KIIs) and focus group discussions (FGDs). A similar mixed method approach was used for the special studies. Limitations of the endline evaluation methodology (e.g. travel restrictions during the COVID-19 pandemic, and relatively small sample sizes) are acknowledged – as far as possible these have been mitigated through more intensive desk-based reviews and data triangulation processes.
Part 1 of this report presents a synthesis of findings and recommendations from the final evaluation. Part 2 of the report provides a more detailed account of the evidence base generated from the endline data collection round and the special studies. The annexes to the report contain additional information on the evaluation design, theory of change, specific data collection methodologies and findings, and lessons for evaluators. The full reports for the special studies (covering the pivot to COVID-19, digital health governance themes, and district-level responses) have been submitted as a separate deliverable.
Part 1: Endline synthesis and recommendations
‘Tracking the Kuunika journey’ through a theory-based evaluation design
From the outset, the Kuunika: Data for Action Project was conceived as a multi-phase project that would progress from formative research, to adaptive implementation, to sustainable scale-up based on adoption of a coherent national health data systems architecture.
In Phase 1 (November 2016 - November 2018), Kuunika’s delivery partners were a consortium of four experienced organisations - Lighthouse Trust (LHT), Baobab Health Trust (BHT), Luke International, Norway (LIN) and the International Training and Education Center for Health (I-TECH). In addition, Cooper / Smith, a data analytics team, sat outside the consortium to provide further technical assistance and analytics capacity.
From Phase 1, the adaptive design of the Kuunika Project has centred on three core pillars: improving data supply; increasing data demand and use; and strengthening data governance. In Phase 1, the intention was to demonstrate the ‘HIV use case’ by targeting a tailored package of data quality improvement and data use initiatives (largely based on Electronic Medical Record (EMR) technology) to five pilot districts in Central and Southern Regions. It was anticipated that the project would then scale up to an additional 22 facilities, while providing ongoing governance and analytics support at national level.
This design of the Kuunika Project in Phase 1 formed the backdrop for the evaluators’ baseline assessment and theory-based evaluation design. The evaluation design referenced a theory of change that depicted the project logic model and results chain. This theory of change informed the design of the subsequent ‘theory-based’ evaluations.
The diagram below presents a schematic timeline of how the project rolled out in practice over the period 2016-2021.
In keeping with Kuunika’s responsive project design, project roll-out was characterised by three key adaptation points. These were: i) a pivot to ‘unlocking defined key capabilities’ in HIV services in 2018; ii) [after the independent midline evaluation in 2019], a significant reconfiguration of the consortium,1 followed by Kuunika being appointed as the prime implementer for the Ministry of Health’s new National Digital Health Division;2 iii) a pivot to support the national response to the COVID-19 pandemic through support for new customised digital platforms, data analytics and ‘knowledge translation’.
A responsive evaluation approach
The endline evaluation and special studies have sought to flex to the ‘strategic tacking’ of the Kuunika Project roll-out to assess project progress (from multiple stakeholder perspectives), and to identify lessons and recommendations for future program design.
The 2019 midline evaluation report documented a number of evidence-based achievements, as well as challenges in the early phases of the project. For example, key achievements included good progress in design and implementation of a Health Management Information System (HMIS) smartphone application (app), along with training in linking the new app and digital services to health workers’ routine work needs and procedures. It was also clear that the focus on ‘user capabilities’ was beginning to strengthen Kuunika’s implementation approach. However, there were growing concerns about the limitations of the EMR technology being rolled out in target districts, weak progress in improving HIV data quality and use, and wider concerns about the overall fragmentation of the health information system (often driven by siloed donor priorities).
Evidence synthesis from across all data sources at the endline evaluation points to useful learning and effective course correction by the Kuunika team. Since 2019, Kuunika’s Phase 3 Sustainability Phase has largely focused on technical assistance to the National Digital Health Division (DHD). The latter has included secondment of Kuunika staff to the Division (to make up a DHD staff complement of 24 staff members) and technical / convening support for development of Malawi’s new Digital Health Strategy, 2020-2025. Figure 2 below provides an overview of some key achievements found at endline. We note that many of these achievements relate to the strategically important objective of advancing the digital health architecture based on an Open Health Information Exchange (HIE) framework.
Evidence-based lessons and recommendations
The endline evidence synthesis has allowed the evaluators to identify a number of lessons arising from the ‘Kuunika journey’. These largely relate to the three Kuunika pillars of: improving data supply; increasing data demand and use; and strengthening data governance – although there are some important generic lessons and lessons for evaluation methodologies (see Annex 6). Key lessons for further reflection include:
● A need to maintain a strong focus on the supply of quality data at each system level as a foundation for consistent data use for decision-making – there remains evidence of weak data quality in key data repositories
● A need for more differentiated data use strategies at each system level to support a more systematic progression towards an inclusive data use culture
● A need to advance consensus-building on a joint Roadmap for operationalising and resourcing the National Digital Health Strategy – this might also require more rigorous thinking on the challenges of maintaining data quality across mixed / hybrid reporting systems
● The benefits of ‘widening the lens’ on digital health governance to embrace inter-ministerial collaboration, investment in digital health governance capacity and a more comprehensive view of the digital health ecosystem.
The final evaluation has also spotlighted some residual challenges in progress towards the project goal of improving the planning, performance and quality of health services through better digital information. While some of these challenges may now be beyond the scope of the Kuunika Project, they are included in this report to give ‘voice’ to the full range of stakeholder perspectives, and to inform future program design. The evidence-based challenges identified at endline largely related to persistent ‘weak links’ in maintaining data quality and data use at district level and ‘points of care’, and concerns expressed by district-level cadres that they were not regarded as “equal digital health partners”. Some weaknesses were found in digital health governance - the ongoing challenge of maintaining robust data protection measures and protection against cybersecurity threats in this ‘immature’ system setting was also noted.
The evaluators have drawn on a revised theory of change, lessons learnt and the overall evaluation findings to present eight recommendations for the next iteration of the Kuunika Project. These are detailed in Section 1.6 of this report and are summarized in the Box below.
RECOMMENDATIONS FOR A FUTURE KUUNIKA PROJECT
1. Continue to back the National Digital Health Strategy (2020-2025), and the modular approach to building the digital health architecture
2. Prioritize effective mechanisms for convening and coordinating partners around a shared digital health vision
3. Extend the Digital Health Division’s reach and links - up, down AND across
4. Engage districts more in digital health planning, convening and design processes
5. Invest in digital health governance capacity
6. Broaden the conceptualization of the ‘digital health ecosystem’ to engage with the ‘real-world’ political economy and institutional factors that can ultimately determine digital uptake and data use for decision-making
7. Have a sustainability and exit plan from the outset to engage with the human and financial resourcing challenges of sustainable digital health solutions
8. As the digital health system progresses towards Shared Electronic Health Records (EHRs), give particular attention to issues of data protection, privacy, and cybersecurity threats
The evaluators have also identified a number of considerations for BMGF and other donors to keep in mind when designing other programs of this nature. Key considerations refer to: a) the need to engage with system complexity and the widespread fragmentation of digital health initiatives in low-income settings; b) the need to strive for more aligned, multi-component and multisectoral approaches to digital health development, while remaining flexible and adaptive; and c) the need to work with other development partners to apply the principles of aid effectiveness and sustainability from the outset.
Part 2: The Evidence Base - Key Findings from Endline Assessments
Key findings by evaluation question
Endline data collection was completed in September-October 2021 to inform the time series analysis in line with the original evaluation design. Data collection at endline covered: i) 20 health facilities across four target districts (Balaka, Blantyre, Machinga, Zomba); ii) District Health Management Offices (DHMO) in the target districts; and iii) the Ministry of Health (MoH), Lilongwe. Endline evaluation findings were validated by triangulating findings from across data sources. As at baseline and midline, the endline analysis focused on addressing the core set of theory-based evaluation questions. Key findings from the triangulated analysis by evaluation question are summarized below.3
● Has data quality improved?
– The endline DQA found that, of the three proxy indicators selected at baseline, only one (# pregnant women testing HIV positive) was reliably available on HMIS / DHIS2. Data quality for this indicator had improved in each sampled district since baseline.
– Loading times: HMIS / DHIS2 web-page loading times had remained stable since early improvements in 2017-18.
– The user assessment found strong examples of clear and user-friendly data visualisations and dashboards on the DHIS2 and OHSP platforms, although there were also some minor issues with data consistency and labelling.
● Has data use increased?
– KAP survey and DQA findings suggested monthly use of HMIS / DHIS2 data had increased over time - across our sample, there were around twice the number of unique monthly users in September 2021 as in September 2018. However, other data use indicators showed relatively little change over this time period.
3 See Annexes 3-5 for key findings by data collection method.
– Findings from KIIs and FGDs suggested, however, that HMIS / DHIS2 data is still perceived to be of limited value to target users – especially at district and facility levels.
● Have attitudes and practices around decision-making improved?
– With regard to sub-national user engagement, there was little sign that perceptions of ‘vertical data extraction’ have changed since baseline. Despite multiple examples of Kuunika engagement with districts, interviews suggested most respondents at this level felt they were primarily seen as sources of data, rather than as collaborators in designing digital systems for use in service delivery.
– With regard to data quality and use at sub-national levels, there was some evidence from the KAP survey of improved perceptions of data quality. However, there was little evidence of sustained improvements in data quality control practices – especially with the onset of COVID-19. Evidence of sustained improvements in data use practices was mixed: the KAP survey pointed to regular data use for reviewing patient records and stock balances; however, our findings also indicated an overall decline in the frequency of data use activities since baseline.
● Have key [HIV] service areas improved as a result?
– In light of the project pivots and adaptations, the evidence base for Kuunika’s contributions to intended HIV service delivery outcomes remained fragmented and inconclusive.
Key findings of the special studies
Key findings from the three special studies – on the pivot to COVID-19, digital health governance, and the role of Kuunika at district level – are summarized in Table 1 below. The findings are organized by key achievements, key challenges and key lessons.
Table 1: Summary of key findings from the special studies
Special Study 1: COVID-19 monitoring and the digital surge
Key achievements:
• Suite of 12 COVID-19 products developed rapidly and collaboratively
• Products led to a reputational boost for Kuunika, and demonstrated the convening power of the DHD
• Kuunika successfully supported functions of the Public Health Emergency Operations Centre and the Pandemic Task Force
Key challenges:
• With respect to the COVID-19 products, some weaknesses were found regarding: district engagement and end-user consultations; reporting burdens; system errors and updates; accessible technical support; hardware and connectivity challenges
Key lessons:
• The study identified opportunities for better engagement of e-Government technicians, capacity building and wider partner engagement – especially at sub-national levels

Special Study 2: Governance – the Regulatory and Policy Essentials
Key achievements:
• Significant contributions to the Expanded Health Data Exchange, and a national digital health vision based on the new Digital Health Strategy and modular OpenHIE framework
• Kuunika has invested significant time, effort and resources in district-level capacity development – especially before 2019
Key challenges:
• Governance positions on policy and standards compliance in the Digital Health Division have yet to be filled
Key lessons:
• Conceptualization of the digital health ecosystem needs to extend beyond technology
• Key legislation and regulations are in place, but ongoing attention must be given to data protection and cybersecurity threats

Special Study 3: Kuunika and the Districts
Key achievements:
• The Blantyre Prevention Strategy is a potential best practice model – it builds on lessons from Kuunika and is fully embedded in district structures
Key challenges:
• Districts remain ‘the missing middle’ – evidence suggests districts are still neglected in digital health development
• Successive evaluations point to continued issues of system fragmentation, and weak / unsustainable data quality and data use initiatives
Key lessons:
• Kuunika’s focus on the vertical HIV program as the ‘use case’ meant there was limited opportunity for districts to engage
• Customised / differentiated data use approaches are needed at district and national level
The Kuunika: Data for Action Project was launched in November 2016 with the aim of improving the planning, performance and quality of health services in Malawi through better digital information. The Kuunika Project is a US$10 million, four-year project funded by the Bill & Melinda Gates Foundation.
In 2016 the Bill & Melinda Gates Foundation (BMGF) appointed Mott MacDonald as the independent evaluation partner for the Kuunika Project, with the aim of generating: (i) insights for better implementation of the project and (ii) more general lessons about how best to introduce new information technology (IT) into existing government systems.
A primary objective of the Kuunika Project was to demonstrate the effectiveness of a customized package of interventions in the context of the HIV/AIDS programme ‘use case’. In line with this objective, the evaluation design aimed to address five top-level questions relating to the project’s role in ‘strengthening HIV-related health data systems’.4 The five evaluation questions were: i) has the quality of HIV data improved? ii) has the use of that data by decision makers and practitioners increased? iii) has decision-making improved? iv) have key HIV service areas improved as a result? and v) what explains the changes (or lack of them)?
This Final Synthesis Report is the last report in a series of reports covering Mott MacDonald’s baseline, midline and endline evaluations. The Final Synthesis Report is accompanied by a set of separate ‘deep dive’ Special Study Reports that investigate Kuunika’s recent pivot to respond to COVID-19, digital health governance themes and the role of the project at district level.
PART 1 of this report focuses on a synthesis of key findings from the endline evaluation enquiry, including the special studies. Section 1.1 examines the original project design and theory of change developed by the evaluators at baseline to inform the overarching evaluation design. Section 1.2 describes how the project evolved over time by navigating three key ‘adaptation points’. In Sections 1.3 and 1.4, the evaluators draw on the evidence synthesis to identify lessons emerging from the independent review of the ‘Kuunika journey’, along with a set of residual challenges. These lessons and challenges have informed the evaluators’ reassessment of the hypotheses underpinning the original theory of change (see Annex 1), and lead to the presentation of a revised theory of change in Section 1.5. Part 1 of this report concludes with a set of eight recommendations for the next iteration of the Kuunika Project, as well as a list of considerations for BMGF (and other donors) when designing similar programs of this nature.
PART 2 of this report provides a more detailed account of the evidence base that contributed to the Part 1 synthesis and recommendations. In this section, we present the findings of the endline evaluation – this repeated the mixed method data collection exercise conducted at baseline and midline. Sections 2.1 and 2.2 describe the responsive evaluation framework and methodology for the endline evaluation exercise. Section 2.3 presents the triangulated quantitative and qualitative findings from the endline data collection round. Part 2 of this report concludes with a summary of the main findings from the three special studies.
The Annexes to this report provide more detailed accounts of the evaluation design, the evidence and literature base for the theory of change, the evaluation data collection tools and instruments and more detailed quantitative and qualitative findings. Annex 6 contains an important list of lessons for designing future evaluations of this nature.
PART 1: Endline Synthesis and Recommendations
Kuunika: Data for Action – an adaptive project design
This section describes the 2016 starting point for the Kuunika Project, including the original intentions, design principles and goals. We also present the theory of change developed by the evaluators to inform the design of the independent theory-based evaluation. This theory of change was the foundational reference point for tracking adaptive changes to the project design and lessons learnt over the course of project implementation.
Phase 1 start-up
From the outset, the Kuunika: Data for Action Project was conceived as a multi-phase project that would progress from formative research, to adaptive implementation and sustainable scale-up, based on adoption of a coherent national health data systems architecture.5
In Phase 1 of the Kuunika Project (November 2016 - November 2018), the delivery partners were a consortium of four experienced organisations, namely:
● Lighthouse Trust (LHT): an established provider of HIV services and clinics in Malawi; LHT led on the quality of care and data use components of the Kuunika Project
● Baobab Health Trust (BHT): a Malawi-registered NGO that had been developing electronic medical records (EMR) systems in Malawi since 2001. BHT led on the information technology (IT) infrastructure and EMR components of the Kuunika Project.
● Luke International, Norway (LIN): brought significant experience of working in the northern districts of Malawi. LIN led on strengthening digital health systems through a standards-based, integrated health information system architecture approach.
● International Training and Education Center for Health (I-TECH): the I-TECH centre is based in the University of Washington’s Department of Global Health. It operates in multiple low- and middle-income countries and has a track record of improving HIV surveillance in Malawi. It focused on training and capacity building for the Kuunika Project.
Additionally, Cooper/Smith, a data analytics team, sat outside the consortium and provided further technical assistance and analytics capacity.
To advance the goal of improved digital health information for decision-making in Phase 1, the Kuunika Project primarily focused on demonstrating the HIV ‘use case’. HIV had been a leading cause of death in Malawi for almost 30 years. This had prompted significant investment by international donors to improve reporting systems and the HIV ‘cascade of care’ to achieve the UNAIDS 90-90-90 target.6,7 This context provided promising conditions for the Kuunika Project to demonstrate its ‘proof-of-concept’ for the link between sound digital data systems, better data use and improvements in service delivery.
From the outset, the Kuunika Project has had three core pillars:
● Improving data supply – making good quality HIV-related health data available and accessible across national, district, facility and community levels of care through the existing District Health Information System 2 (DHIS2) platform.
● Increasing data demand and use at all levels with training, mentoring and incentives.
● Strengthening data governance to implement and enforce data systems standards, operating procedures and practices.
Focussing on the geographical areas with the greatest HIV patient volume, Kuunika took a ‘phased, iterative’ approach to testing a range of interventions in the first two years. Along with initiatives to strengthen the underlying digital health system architecture, interventions were designed to be tailored to facility, district and national levels as follows:
● At facility level, packages of interventions were tailored to the volume of HIV clients and focused on roll-out of EMR systems (including infrastructure improvements), data use training and mentoring.
● At the district level, interventions included training in data use for the District Health Management Team (DHMT) and District Council, supplemented by support and mentoring from a dedicated Kuunika district monitoring and evaluation (M&E) officer.
● At the national level, initiatives focussed on technical assistance to strengthen health data governance, analytics and procedures.
In Phase 1, interventions were intended to reach 22 facilities in five pilot districts of Central and Southern Regions (Blantyre, Lilongwe, Mangochi, Thyolo, and Zomba), with the intention of scaling up to a further 22 facilities in additional districts in Phase 2.
The Evaluators’ Theory of Change (Phase 1)
The project theory of change sets out how the project’s interventions are expected to lead to intended results. The theory of change is the cornerstone of a theory-based evaluation. It is the principal reference point for testing hypotheses and contributions to intended results. It also allows lessons to be identified for future project design.
Since the Kuunika project design documents did not include a theory of change, the evaluators reconstructed one at the beginning of Phase 1 to inform the evaluation design. The reconstructed theory of change was based on consultations with the Kuunika team, review of the project design documents and a review of the relevant literature to establish the evidence base.
The reconstructed theory of change developed in Phase 1 is represented in the diagram below (Figure 3).
This theory of change diagram was constructed for the 2017 baseline evaluation and shows how the Kuunika ‘investments’ or activities were aligned to a results chain of intended output, outcome and impact-level results. The logic of the theory of change is summarized in the Box below.
Box 1: Summary description of the Kuunika Theory of Change (Phase 1)
IT infrastructure investments lead to improved ‘data outputs’: evident and measurable improvements in the amount of time the databases are available to the user (‘availability’) and in the ease with which the data can be retrieved, combined and viewed (‘accessibility’).
Many new ways of combining and analysing data thereby become possible. These possibilities will be realised through training and capacity building to provide targeted users with skills and incentives to use the new data systems, while the creation of Ministry of Health (MoH)-wide data governance structures will harmonise and maintain data standards, underpinning users’ trust in the quality of the data and encouraging its use.
The development of the theory of change was underpinned by a review of the evidence base and identification of hypotheses relating to pathways to results. For a summary of the evidence review and underlying hypotheses for this theory of change, see Annex 1.
Kuunika: pivoting, adapting, responding
This section examines how the Kuunika project has pivoted and adapted over time to respond to changes in the operational and institutional environment and the onset of the COVID-19 pandemic. It also summarizes the main achievements and challenges identified at key evaluation milestones.
The project timeline
Figure 4 below provides a schematic overview of key milestones and points of adaptation as the project moved from the Phase 1 and 2 design and early implementation stages to the Phase 3 sustainability stage.8
Phase 2 and the first project pivot
By November 2018, Kuunika had completed a number of important planning and preparatory activities. However, implementation progress was slow, in part due to delays in recruitment of a Project Director (hired in April 2018) and other general staff. Individual Consortium members were preoccupied with prior commitments and clients (including a long-standing CDC-funded initiative to develop and roll out EMRs). This convergence of factors effectively locked Kuunika into an implementation schedule based on roll-out of older (self-standing, tablet-based) technology, and limited the extent to which the team was able to focus on the national and district-level dialogue necessary to advance the vision of EMRs as part of an integrated digital health system.
In June 2018, BMGF advised Kuunika to refocus its efforts and show more convincing progress with a clearly defined, swiftly delivered, ‘core package’ of deliverables. A ‘project pivot’ was, therefore, developed to direct new and existing activities to ‘unlocking defined key capabilities’ in HIV services. Key features of the pivot included:
● A tighter focus on existing facilities, rather than rolling out to new ones
● Accelerated delivery of specific data products and a revised training approach.
● More sustained attention to the underpinning health information system architecture, with new targets for the achievement of the Master Health Facility Registry, the Demographic Data Exchange (DDE) and the interoperability layer.
The project pivot began in November 2018 and ran through to the end of April 2019. This was followed by an Implementation Review and midline evaluation by Mott MacDonald in mid-2019 to further inform the design of Phase 2.
Second adaptation point
The independent midline evaluation (April-June 2019) confirmed that the project pivot begun in November 2018 was justified and well-conceived in prioritising a ‘core package’ of selected activities. However, the evaluation also found Consortium partners were still struggling to deliver the agreed core package – in part due to competing implementation priorities and delivery schedules for their funders. Evaluation visits to the districts and discussions with facility staff surfaced persistent concerns about the poor state of basic IT infrastructure and staff capacity to use digital health technology, including EMRs – this, in turn, had implications for the quality of reported HIV data.
The midline evaluation also highlighted the complexities of the digital health governance environment in Malawi. The evaluation findings pointed to an increasing number and variety of digital health initiatives at district level. There was also little convergence between donors on approaches to digital health innovation, including roll-out of EMRs.9
A complete summary of key Kuunika achievements and challenges identified at the midline evaluation is included in Annex 2. Figure 5 shows the main achievements, challenges and issues of wider concern identified at the midline evaluation.
After the midline evaluation, the Kuunika Project restructured and streamlined its core partnerships, leaving LIN and Cooper/Smith as the Lead Agencies, with other partners providing consultancy inputs as needed. As a consequence, Kuunika dropped its involvement in EMR roll-out at the district level.10 Instead, it responded to MoH’s request to support the new Digital Health Division and implementation of the new National Digital Health Strategy (2020-2025) over Kuunika’s final Sustainability Phase.
Since May 2019, there have been two Presidential elections in Malawi. These have resulted in changes in political leadership and in the institutional arrangements for the Digital Health Division. Initially, the new Digital Health Division was located under the Quality Management Directorate, but since early 2020 it has been located under the Directorate of Planning and Policy Development, where it has experienced a period of relative institutional stability.
Phase 3 and the third adaptation point
The first cases of COVID-19 were reported in Malawi in March 2020. The Government of Malawi acted swiftly to introduce containment measures, despite the absence of a coherent pandemic plan.
Recognizing that digital health technology could play a key role in monitoring and containing the pandemic, MoH’s Public Health Institute of Malawi (PHIM) engaged Kuunika to develop a set of digital solutions to support pandemic surveillance and tracking, data analytics, case management and data use for decision-making.
Along with the 24 staff it supported at the Digital Health Division, the Kuunika team worked with key MoH structures and role-players11 to develop a number of COVID-19 digital products. This included, most notably, a DHIS2 mobile-based tool for recording and reporting disease surveillance data that built on UNICEF’s ‘One Health’ Surveillance Platform (OHSP).12 In addition, smart phones and tablet devices were distributed to District Environmental Health Officers (DEHOs), health facilities and ports of entry across all 28 districts to support case-based surveillance, contact registration and follow-up.13 A further set of digital support tools (including a national information dashboard, and systems for COVID-19 vaccination reporting / certificates) were designed to support the functions of the Public Health Emergency Operations Centre at PHIM.
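To make the reporting flow behind these surveillance tools more concrete, the sketch below shows how a single case-based surveillance event could, in principle, be submitted to a DHIS2 instance through its standard Web API. This is an illustrative sketch only, not the project’s actual implementation: the server URL, credentials and the program, organisation unit and data element identifiers are hypothetical placeholders, and the deployed OHSP tools may capture data through mobile apps or bulk imports rather than single-event calls.

```python
import requests

# Placeholder connection details - not the real OHSP / DHIS2 instance.
BASE_URL = "https://dhis2.example.org"
AUTH = ("reporting_user", "password")  # DHIS2 uses HTTP basic auth by default


def report_surveillance_event(program_uid, org_unit_uid, event_date, data_values):
    """Post one case-based surveillance event to the DHIS2 events endpoint.

    `data_values` maps data element UID -> reported value. All UIDs used
    here are hypothetical placeholders, not OHSP identifiers.
    """
    payload = {
        "program": program_uid,
        "orgUnit": org_unit_uid,
        "eventDate": event_date,
        "status": "COMPLETED",
        "dataValues": [
            {"dataElement": uid, "value": value} for uid, value in data_values.items()
        ],
    }
    response = requests.post(f"{BASE_URL}/api/events", json=payload, auth=AUTH, timeout=30)
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    # Hypothetical report of a single COVID-19 case from a port of entry.
    summary = report_surveillance_event(
        program_uid="CovidSurvProg",
        org_unit_uid="PortOfEntry01",
        event_date="2021-09-15",
        data_values={"testResultUid": "Positive", "ageYearsUid": "34"},
    )
    print(summary)
```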
A summary of key verified Kuunika achievements at endline evaluation is shown in Table 2 below. We note that several of these achievements relate to the strategically important objective of advancing the digital health architecture based on an Open Health Information Exchange (HIE) framework.
10 This was also driven by a shift in CDC’s EMR investment from BHT to the Elizabeth Glaser Pediatric AIDS Foundation (EGPAF).
11 These included: MoH senior management teams, the Health Cluster, the President’s Task Force, the National Task Force on COVID-19 vaccines and Disaster and Risk and Recovery Management
12 UNICEF initiated OHSP development with LIN in 2019 to support integration of surveillance data on human health, animal health and ecosystem health.
13 This activity was assisted by other partners, such as ONSE, CHAI, PIH, Last Mile Health
Table 2: Summary of key Kuunika achievements at endline
KEY KUUNIKA ACHIEVEMENTS
● Development of key ‘business domain services’ such as the Logistics Management Information Service (LMIS) and Health Management Information Service
● Development of key Registry Services such as the Terminology Service and Facility Registry
● Development of a generic Application Programming Interface (API) for linking any system to the Interoperability Layer, including validation of core metadata during exchange, system tracking and error logs, and routine exchange between two core sub-systems (the HIV/AIDS Management Information System and OpenLMIS) – an illustrative sketch of this exchange pattern follows the table
● Development of HMIS dashboards and a ‘Health Situation Room’ platform
● Development of a mobile Supportive Supervision Application for collecting District HIV/AIDS data and technical support for data use initiatives in Zomba District
● Technical leadership for COVID-19 products including the One Health Surveillance Platform (OHSP), Surveillance Community Tools, an Incident Management System and a COVID-19 Website
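The bullet on the generic API above describes the exchange pattern only at a high level. As a purely hypothetical illustration of that pattern, the sketch below shows a client that validates a minimal set of core metadata before submitting a record to an interoperability layer endpoint and logs any rejection or server error. The endpoint path, required fields and payload structure are assumptions made for illustration; they are not the actual Kuunika or OpenHIE interface.

```python
import logging

import requests

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("hie-exchange")

# Hypothetical interoperability layer endpoint - not the actual Kuunika API.
HIE_URL = "https://interop.example.org/api/exchange"

# Core metadata assumed (for illustration) to be required before exchange.
REQUIRED_METADATA = ("facility_code", "reporting_period", "source_system", "dataset")


def validate_metadata(record: dict) -> list:
    """Return the list of missing core metadata fields (empty if the record is valid)."""
    return [field for field in REQUIRED_METADATA if not record.get(field)]


def submit_record(record: dict) -> bool:
    """Validate and forward one record; log and skip it if validation or submission fails."""
    missing = validate_metadata(record)
    if missing:
        log.error("Record rejected - missing metadata: %s", ", ".join(missing))
        return False
    response = requests.post(HIE_URL, json=record, timeout=30)
    if response.status_code >= 400:
        log.error("Interoperability layer returned %s: %s", response.status_code, response.text)
        return False
    log.info("Record accepted for exchange (facility %s)", record["facility_code"])
    return True


if __name__ == "__main__":
    # Example record routed from an HIV programme system towards logistics reporting.
    submit_record({
        "facility_code": "MW-ZA-0012",   # placeholder facility code
        "reporting_period": "2021-09",
        "source_system": "hiv-mis",
        "dataset": "art_stock_consumption",
        "values": {"arv_stock_on_hand": 420},
    })
```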
Reflecting on lessons learnt
This section reflects on the important lessons emerging from the Kuunika implementation experience in Malawi. It is primarily based on evidence from the endline evaluation; however, it is also informed by the evaluators’ reflections on earlier evaluations and the ‘Kuunika journey’ over time. The section also considers lessons for evaluation methodologies.
The lessons synthesis
The following synthesis of lessons from the Kuunika Project is primarily derived from evidence from the endline evaluation. Endline evidence sources included: the endline KAP survey (KAP); the data quality assurance (DQA) reviews; focus group discussions (FGD); key informant interviews (KII); and a desk review (DR). These were supplemented by in-depth special studies (SS) (see Part 2 of this report).
The lessons synthesis is also informed by the evaluators’ overview of the ‘Kuunika journey’ and a review of findings from the baseline and midline evaluations.
The following lessons are organized around the three Kuunika pillars of: improving data supply; increasing data demand and use; and strengthening data governance. Some additional generic and methodological lessons are included at the end of this section. The evidence source for each lesson is indicated with the respective abbreviation.
Lessons on improving data supply
● Maintaining the supply of quality data: Experience from the Kuunika Project supports a growing body of evidence on the interdependency between data quality and data use.14 While the Kuunika team’s support for strengthening the digital health system (combined with data use campaigns) is likely to have contributed to improvements in data use, there are persistent challenges of poor data quality in key data repositories (DQA, KII, SS). Reference to the wider literature base suggests projects of this nature need to remain multi-component (or strategically aligned) to ensure data use initiatives are well supported by data quality improvement measures at each system level (DR).15,16
Lessons on increasing data demand and use
● Differentiated data use strategies: Kuunika has remained focused on improving data use for decision-making at each phase of the project; however, the focus of data use initiatives has shifted between different system levels (KII, DR). Evidence from the literature base highlights that differentiated approaches are needed if data use initiatives are to be effective across different health system levels and thematic areas (DR).17 Monitoring, evaluation and learning from data use interventions could be enhanced if targeting strategies were more explicitly defined, along with criteria, indicators and timeframes for measuring success.
● Demand for data: The experience of convening partners and technical expertise to respond to the COVID-19 pandemic (with a focus on new digital platforms, data analytics and knowledge translation / support) has demonstrated how demand for data can drive collaborative working and incentives for data use (KII, SS).
14 PATH. (2019). Immunization Data: Evidence for Action. A Realist Review of What Works to Improve Data Use for Immunization, Evidence from Low- and Middle-Income Countries [précis]. Seattle: PATH, Pan American Health Organization.
15 Ibid
16 For example, the above study found that data quality assessments at facility level, data review meetings at district level and inter-ministerial peer networks at national level can all contribute to a synergistic demand for high-quality immunization data.
17 World Health Organization. (2019). A Realist Review of What Works to Improve Data Use for Immunization Evidence from low- and middle-income countries. Precis available at: 2_YellowBook_PATH_IDEA_Precis_L5.pdf (who.int)
Lessons on strengthening data governance
● From Strategy to Roadmap: The new National Digital Health Strategy (2020-2025) provides the Malawi health sector with a strong strategic direction for advancing the digital health system to maturity (KII, DR, SS). However, more progress is needed on developing a credible, joint ‘Roadmap’ for operationalising and resourcing the Strategy (DR).18
● A wider lens on digital governance: There is potential for strengthening linkages to other ministries and structures working on ICT themes in Malawi. More specifically, there is a need for greater harmonisation and alignment on digital governance, including on legislation and policies, technical standards and compliance, and workforce development. The Department of e-Government (whose staff are seconded to the ICT sections of MoH) offers a potential conduit for greater inter-ministerial collaboration. Issues of data security, protection, national registration and biometric identifiers are priority themes for ongoing dialogue (KII, DR, SS).
● Digital health governance capacity: There remains scope for strengthening governance capacity in the Digital Health Division. Several posts in the organogram for the Division (covering governance, standards and compliance) have remained vacant (KII, DR). Finalisation and roll-out of draft Standard Operating Procedures, and inputs to the provisions of the 2021 Data Protection and Privacy Bill are now a priority (DR).
● Digital public goods: There is strong stakeholder consensus that the presence of Kuunika technical support for the Digital Health Division, combined with a robust Digital Health Strategy (2020-2025), and a shared vision for the national digital health architecture creates new opportunities for improved harmonisation and alignment of donor-supported initiatives. However, to date there is limited evidence of donor / partner buy-in to concepts of digital public goods that underpin this national vision.19 Advancing this vision needs strong and concerted leadership and advocacy (nationally and internationally).20
Generic lessons
● Expanding the concept of the digital health ecosystem: Several key informants described how the Kuunika Project has navigated a course through a changing context. Some alluded to a series of operational fault lines or divides (e.g. between a community /district and national-level focus; between donor and government reporting systems; and between the competing delivery / learning priorities of Implementing Partners) (KII, SS). A broader view of the ‘digital health ecosystem’ that is informed by regular stakeholder and political economy analysis could help the Kuunika team head off some of the tensions associated with these operational fault lines.
● The transition to transformation: The reality of implementing digital health projects is that there can be a long transition from the ‘legacy system’ to the ambitions of digital health transformation. The literature base suggests that, in developing the Roadmap to digital health transformation, there needs to be stronger conceptualization of the interim or transition phase to ensure data quality and reliable reporting are maintained across mixed / hybrid information systems (DR).21
Lessons for evaluation methodologies
● Applying the principles of a theory-based evaluation: The highly adaptive nature of the Kuunika Project has created a number of challenges for design of the independent theory-based evaluation, as well as for data collection and analysis. Lessons learnt and the implications for future evaluations of this nature are detailed in Annex 6 of this report.
18 Tanzania’s Data Use Partnership Roadmap provides a strong example of what a Roadmap process and product might look like. Available at: https://path.azureedge.net/media/documents/DHS_health_tanzania_rpt1.pdf
19 Digital Public Goods Alliance. (2021). Understanding the Relationship between Digital Public Goods and Global Goods in the Context of Digital Health. Available at: https://digitalpublicgoods.net/DPG-GlobalGoods.pdf
20 Ibid
21 Sahay, S., Nielsen, P., & Aanestad, M. (2019). Institutionalizing Information Systems for Universal Health Coverage in Primary Healthcare and the Need for New Forms of Institutional Work. Commun. Assoc. Inf. Syst., 44, 3.
Remaining challenge areas
This section draws on evidence from our endline evaluation and special studies to identify some residual challenges or ‘weak links’ in achieving the original Kuunika goal. While some of the challenges may now be beyond the direct scope of the Kuunika Project, they are presented here to give ‘voice’ to stakeholder feedback at all system levels, and to inform future program design.
Rationale
Evidence from the endline evaluation has spotlighted some persistent challenges in progress towards the project goal of improving the planning, performance and quality of health services through better digital information.
The evaluators acknowledge that, given Kuunika’s Phase 3 focus on supporting the Digital Health Division, some of the challenges listed below are beyond the control or scope of the project team. Nevertheless, the challenges below are evidence-based (drawing on triangulated evidence from the endline survey, data quality assurance review, stakeholder discussions, key informant interviews and special studies – see Part 2 of this report). They are presented as a starting point for consideration in future program design.
Residual ‘weak links’
The remaining challenge areas or ‘weak links’ identified at endline included:
● The human/manual transfer of data from paper sources to DHIS2 is a weak link in maintaining data quality. At health facility level, new programs with demanding monitoring expectations are pushing inadequately-resourced data clerk tasks to the limit (KAP, KII, SS).
● Weak support for hybrid systems: Endline evaluation evidence suggested that existing technical support, quality control and supportive supervision mechanisms need strengthening to address the ‘real world’ mix of electronic and paper-based systems at facility and district levels. Key informant and expert interviews elicited the following suggestions for additional support:
– Targeted support to data clerk and HMIS officer functions, possibly by replicating the well-established Department of HIV & AIDS (DHA) supportive supervision process and / or leveraging the support of e-Government officers at district level.
– A comprehensive review and refresh of procedures / forms for data entry into DHIS2 to weed out old forms and registers and bring all up to a basic manual
– Targeting of data use support to in-charges and senior nurses to participate in supportive supervision, so that they see more ‘point of care’ data, take ownership of quality issues, and nurture ‘enthusiasts’.
● Weak inclusion of end users at district level: There is a continuing complaint from District Health Officers (DHOs) and health workers that they are not heard or included in the design and development of digital health initiatives – be these led by government or donors. These stakeholders report they have not been consulted on the design of SOPs relating to key digital health governance and operational procedures. With the exception of Zomba District, there seems to have been minimal engagement on managing COVID-19 digital data. In short, District [and facility] Officers report they were not adequately engaged as “equal digital health partners” (KII, SS).
● The difficulty of navigating DHIS2 for less experienced users, despite the introduction of visualisation tools and a doubling in users (from a low base) in 2017. Our survey respondents at district and facility levels reported that they refer to the DHIS2 / HMIS data relatively rarely. It appears that a virtuous circle of increasing data use for decision-making is still to be initiated.
● Infrastructure and connectivity constraints: While it is acknowledged that Kuunika, other digital partners and government itself have continued to invest in improved connectivity and digital access in
Malawi,22 investigations conducted for Special Study 1 indicated that over 60% of users continue to experience connectivity challenges, and 70% use their own money to procure internet bundles.
● Ongoing issues of system fragmentation: Successive independent reviews and evaluations continue to highlight issues of fragmentation of the digital health information system, especially at facility and district levels.23 Endline evaluation findings consistently pointed to the need for better donor coordination to limit the data reporting demands of new initiatives, and to ensure compliance with established standards and harmonisation and alignment principles.
● Keeping pace with the multiple challenges of good digital health governance: As discussed in Section 1.3 and Special Study 2, there are key areas of digital health governance that need ongoing attention and investment – with the challenge of optimising the balance between human rights, data protection and public health priorities being a central and cross-cutting concern.
22 See: World Bank. (2021). Malawi Economic Monitor: Investing in Digital Transformation. June 2021
23 See, for example, Vital Wave. (2019). Assessment of EMR Systems in Malawi: Initial Landscape Assessment. Prepared for MoH, Malawi.
The revised Theory of Change
Based on a review of the ‘Kuunika journey’ over the past five years, the original theory of change hypotheses (Annex 1), lessons learnt, challenges and consultations with key informants, the evaluators have drafted a new general theory of change for the Kuunika Project. This version of the theory of change is intended to inform dialogue on the next iteration of the project design.
Figure 7 below presents the evaluators’ proposed theory of change for the next iteration of the Kuunika Project. This theory of change builds on some Kuunika activities that have already commenced; however, it also references some activities that should be considered to advance progress towards intended results.
In this version of the theory of change, the project’s intended outcome and impact-level results are now articulated in terms of improved planning capabilities, improved core health services, and improved health outcomes, cost-effectiveness and impact. All these result areas are expected to be underpinned by sustainable institutional change.
Key task areas will encompass activities relating to:
● Improving or ‘re-vamping’ the digital health system architecture
● Enhancing data availability, accessibility and usability
● Building the capacity of application / system developers, data producers and users
● Building the foundations of sound digital health governance
In combination these task sets and activities are expected to produce strategic outputs relating to:
● New data use services - such as dashboards, tools and reporting services to support timely evidence-based decision-making at all system levels
● New developer and data user skills – with capacity to generate, access, use and interpret data at all system levels
● Sectoral change based on an inclusive digital health culture in which all target users can engage with innovation and access timely, quality data to inform policy, clinical / point of care and resource management decisions that will contribute to improved quality health care.
Together, these outputs are intended to provide pathways to the intended outcome and impact results. They will also contribute to a sustainable institutional environment that supports the objectives of successive Malawi Health Sector Strategic Plans.
Recommendations for future project design
This section draws on the revised theory of change, lessons learnt and the evaluation findings to present recommendations for the next iteration of the Kuunika Project.
Table 3: Principal recommendations

Recommendation 1: Continue to back the National Digital Health Strategy, 2020-2025, and the modular approach to building the digital health architecture.
Comments: These form the core elements of a shared digital health vision. All partners and donors should be urged to observe the principles of responsible ‘global digital health citizenship’ and ensure their digital health interventions link both horizontally and vertically into the agreed digital health framework and observe common standards. This step should form the basis of consensus-building on a joint Roadmap for operationalising and resourcing the Digital Health Strategy.

Recommendation 2: Prioritise effective mechanisms for convening and coordinating partners around a shared digital health vision.
Comments: Fragmentation of the digital health system has remained a feature of successive system reviews. The experience of successful partner collaboration to meet data demands for the COVID-19 response provides an important opportunity to ‘build back better’. There are likely to be important lessons from earlier experience of Sector Wide Approaches in Malawi to guide this endeavour.

Recommendation 3: Extend the Digital Health Division’s reach and links – up, down and across.
Comments: As well as ensuring that the Digital Health Division has appropriate technical capacity to work with other higher levels of government, such as the Department of e-Government on digital health governance, standards and compliance, and the Department of Planning and Development on finance and monitoring, the Digital Health Division should engage with the district-level structures of these departments, e.g. for cascading IT and M&E support down to the facilities.

Recommendation 4: Engage districts more in digital health planning, convening and design processes.
Comments: Consider making District Health Officers members of, rather than simply recipients of, central funding. Donors should be alerted to the important formal responsibilities of districts in service delivery and should help them mature as players in national health sector and digital health planning processes.

Recommendation 5: Invest in digital health governance capacity.
Comments: Support efforts to ensure the Digital Health Division has appropriate capacity on digital health governance, standards and compliance. Governance officers should work closely with the ICT Ministry, prioritize finalisation of the digital health SOPs, and facilitate inputs to Malawi’s Data Protection and Privacy Bill. Capacity inputs should extend to sub-national levels to ensure digital health legislation, regulation and procedures are understood, implemented and enforced at each system level.

Recommendation 6: Broaden the conceptualization of the digital health ecosystem to connect with wider political economy and institutional factors that are likely to determine digital uptake and data use for decision-making.
Comments: For example, consider that in many health sector settings in Malawi, better data-driven decision-making currently lacks the contextual drivers of autonomy and resources to gain traction. Continue to focus on initiatives at the district level, including digital solutions that reduce workloads and complexity for over-burdened and underpaid staff. Acknowledge the lynchpin / gatekeeper role of facility-level data clerks and district HMIS officers – focus on ‘holistic’ initiatives to ensure they have access to customised, ‘user-tested’ tools and technical / supervisory support.

Recommendation 7: Have a sustainability and exit plan right from the start that engages with the human and financial resourcing challenges of sustainable digital health solutions.
Comments: This sustainability and exit planning should also extend to Kuunika’s support to the Digital Health Division to ensure there is systematic and timely skills transfer, along with resource planning, so that technical support does not become ‘capacity substitution’. Ensure Kuunika maintains a clear separation between governance and implementation roles to avert any stakeholder concerns about conflict of interest.

Recommendation 8: As Malawi’s digital health system matures and progresses towards Shared Electronic Health Records (EHRs), give particular attention to issues of data protection, privacy, and cybersecurity threats.
Comments: New digital health solutions should be accompanied by key assessments, such as threat risk and privacy impact assessments. There is scope for working more closely with the ICT Ministry, collaborating internationally, and contributing to legislative dialogue, e.g. on Malawi’s Data Protection & Privacy Bill. There may also be considerable mileage in supporting efforts to implement and enforce (through training, SOPs, supportive supervision etc.) existing legislation and regulations.
Considerations for other digital health initiatives
This section takes a ‘step back’ to offer some thoughts to the Bill and Melinda Gates Foundation – and other donors – on wider lessons for digital health programs in resource-constrained settings.
● Digital health projects probably cannot operate in only the health sector. When investing in a specific use case, be clear about the concurrent investments needed at other system levels, and those that need to be addressed at a national, government-wide, level.
● Digital projects need multiple entry points. The causal chains are not simple or linear. The simple logic of the original Kuunika design is appealing but did not stand the test of reality. A ‘building blocks’ approach can support flexibility, phased implementation and system transition planning.
● Digital projects need to remain highly adaptive to the changing context, as well as technical developments and requirements. The Foundation’s flexible approach to the funding and management of Kuunika has been the right one, and one not always seen among other donors.
● A digital health intervention cannot design systems without properly considering who will use them, and how they will be used in the health centres and districts.
● IT programs are more than just hardware. Much of Kuunika’s success has been in building relationships (formal and informal) across government, creating a flexible team, providing foundational landscape maps and statistics and growing a pipeline of able staff. (It is important to acknowledge, however, that basic infrastructure and hardware can be a key determinant of success.)
● BMGF has an important role in promoting the ‘global goods’ nature of IT development on open source systems.
● Understanding the political economy in Malawi, including the role of major donors, is key and needs to begin from the start. Evidence from Kuunika is that progress may have been considerably influenced by how well potential ‘fault lines’ between departments and between donors have been identified and negotiated over time.
● Political economy is not just a one-off inception exercise but an ongoing skill or project function, requiring experienced project staff able to connect and operate within the system.
● Digital health investments tend primarily to serve the disease focus of donors and their reporting systems; BMGF and other donors should promote harmonized supplier procurement, as well as buy-in to a shared interoperability infrastructure.
● Long-standing principles of aid effectiveness and digital development can be brought together and grounded by giving more force and responsibility to national digital forums – but there is a need to take them beyond just talking shops to become real delivery support bodies.
● Do some fresh thinking on sustainable capacity building for digital governance: go beyond the short-term solution of funding and filling posts in line ministries. Peer-to-peer organisational support and/or BMGF fellowship schemes may be alternatives.
● ‘Use case’ approaches to digital health investments make sense: they provide a bridge to the health workers at the facility level and the realities of working on the ground.
● But ‘use case’ approaches need to be complemented by a systems focus that works to promote digital health governance at all levels.
PART 2: Endline Surveys and Special Studies
Transitions in the evaluation approach
This section summarizes the overall evaluation design and the modifications made along the way to capture Kuunika’s pivots and adaptations. Limitations to the evaluation approach are also explained. Mott MacDonald was appointed in 2016 as the independent evaluator of the Kuunika Project, with the aim of generating: (i) insights for better implementation of the project and (ii) more general lessons about how best to introduce new information technology (IT) into existing government systems in Malawi.
The evaluation originally aimed to answer five top-level questions relating to the project’s role in ‘strengthening HIV-related health data systems’. The five evaluation questions were: i) has the quality of HIV data improved? ii) has the use of that data by decision makers and practitioners increased? iii) has decision-making improved? iv) have key HIV service areas improved as a result? and v) what explains the changes (or lack of them)?
In response to the pivots and changes in the project, and intentions for a follow-on phase, the evaluators also aimed to address two supplementary questions in the synthesis stage. These were:
● How effective has Kuunika’s sustainability and COVID-19 pivot phase been?
● What should Kuunika II look like?
Our impact evaluation used a mix of quantitative and qualitative surveys conducted at baseline, midline and endline, plus a set of special studies to explore agreed areas of the theory of change more deeply.
The baseline evaluation was conducted in August 2017 and the midline evaluation in April 2019. This section of the report (Part 2) sets out the findings of the endline evaluation – which included a repeat of the baseline/midline mixed-method data collection exercise and three ‘deep-dive’ special studies. The findings from the endline evaluation, together with reflections from the earlier evaluations, have formed the evidence base for the Part 1 synthesis and recommendations.
To capture the changes in the project, and respond to recent restrictions imposed by the COVID-19 pandemic, the midline and endline surveys involved some modifications from the baseline. Although, technically, this was a deviation from evaluation and research best practice, this responsive approach allowed us to maximize the lesson-learning value of the evaluation.24 The main modifications to the evaluation approach at midline and endline were:
● The midline covered a different set of districts to the baseline. In 2017, we covered Blantyre and Machinga Districts as ‘treatment’ districts and Balaka District as a comparator. At the request of the BMGF and Kuunika, we swapped Machinga district for Zomba District at midline.25 Zomba District had become a district of focus for an accelerated core package of interventions in November 2018, so it was agreed that evidence from Zomba District would be more useful for informing decisions on Phase 2 of the project.
● The endline evaluation has covered all four districts,26 including both Machinga and Zomba. This has allowed us to retain as much time series data as possible.
● The international team did not travel to Malawi for the final round of endline data collection. Due to COVID-19 travel restrictions and safety measures, the endline data collection had to be overseen and supervised remotely. In practice, this meant: the KAP survey was conducted face-to-face by local enumerators (with appropriate safety measures in place); focus group discussions were
24 This responsiveness might also be seen as consistent with BMGF’s preference for evaluation work to support ‘co-creation’.
25 The districts were originally selected to be as similar as possible in influential contextual factors such as district-level administrative capacity, urbanisation and geographical accessibility of facilities and HIV/AIDS prevalence. The new midline district, Zomba, was similarly consistent.
26 As a result, the 2021 KAP survey was larger than the previous two rounds, with N = 220+ in 2021 compared to around 180 in 2017 and 2019.
conducted ‘semi-remotely’, facilitated by a local team member; and key informant interviews were largely conducted virtually from the UK. The final data quality audit was conducted as a desk review in the UK, and focused primarily on the DHIS2 user interface; at endline it did not include a systems assessment.
With regard to the special studies, these were originally intended to explore key areas of the Phase 1 theory of change. However, the final selection of study questions aimed primarily to understand the implications of key issues and events during project implementation, with the ultimate aim of informing a revised theory of change.
The limitations to the evaluation design and methodology at endline are acknowledged. They are systematically described in the table below, along with mitigating action taken.
Table 4: Summary of limitations of the endline evaluation design and methodology

Limitation 1: Small sample sizes
Description: Project design and budget limited the feasibility and value of a larger sample size.
Implication and mitigating action: Multiple respondent categories imply small numbers, so the quantitative data is not statistically robust. We used a ‘small n’ evaluation methodology and aimed to ensure the validity of findings through triangulation techniques.27

Limitation 2: Some changes in panel sample of districts over time
Description: The originally intended phased roll-out of implementation across identified districts was dropped as the project approach evolved over time.
Implication and mitigating action: The original intention to treat Balaka as a comparator district could not be carried through, with some implications for the time series analysis. To maintain rigour, final analyses did not include disaggregation of data by district.

Limitation 3: Some KAP survey questions omitted at endline
Description: Possible programming errors and limited remote real-time QA of fieldwork resulted in some data collection errors.
Implication and mitigating action: Implications for the time series analysis. Only endline survey findings that are fully evidence based have been included in this report.

Limitation 4: Availability of key stakeholders to participate in remote KIIs
Description: Despite an extensive sample frame of contacts, there were significant problems in setting up KIIs by phone, particularly with facility and district staff.
Implication and mitigating action: Delays in completing target numbers and range of KII participants. FGDs at district level facilitated by a national team member filled some gaps. Limited interview data for the endline and special studies (especially Special Studies 2 and 3) were mitigated through intensification of document and secondary data reviews.

Limitation 5: On-site DQA data verification against source documents and system assessment not possible
Description: COVID-19 safety measures required significant modification of the original facility-based DQA method.
Implication and mitigating action: An alternative approach of photo-capture of register pages and summary forms by the enumeration team during the KAP survey was introduced; however, some blurred photos limited the number of observations. The omission of the system assessment component reduced qualitative data, but this was partly compensated for by ad hoc remote KIIs.

Limitation 6: COVID-19 travel restrictions
Description: Mott MacDonald’s duty of care responsibilities and COVID-19 response meant safety was prioritized and international travel was strictly limited during 2020 and 2021.
Implication and mitigating action: International team members did not visit sites or interview stakeholders in person. However, trainings, regular debriefs and key informant interviews were conducted remotely. With minor exceptions, the same evaluation and enumerator teams that were used for the baseline and midline evaluations were used; they were, therefore, highly familiar with the evaluation context and methodology, and experienced in using the evaluation data collection tool-kits.
The endline fieldwork
This section describes the methodology used for endline data collection, including the sampling strategy and mixed methods deployed. As far as possible, this methodology replicated the approach used at baseline and midline to support a time series analysis. At endline, some adjustments were made to respond to the COVID-19 pandemic.
Sampling strategy
The third and final round of mixed method data collection was conducted during September-October 2021 across all levels of the health service. The data collection round covered:
● 20 health facilities across four districts (Balaka, Blantyre, Machinga, Zomba)
● District Health Management Offices (DHMO) of the four districts
● The Ministry of Health in Lilongwe.
The 20 health facilities (five in each district) were purposively selected to be government-managed and to include a range of health facilities that were high, medium and low volume in terms of HIV patient attendance.28 The 15 facilities in Blantyre, Balaka and Machinga Districts were the same as in the 2017 baseline. Notably, the five facilities in Zomba District replaced those in Machinga District at midline, but we included all four districts at endline (thus increasing the sample of facilities from 15 to 20).
Of the 20 facilities sampled at endline, 15 were health centres from the third and lowest tier of the health system, serving primarily peri-urban communities. Four (one per district) were district hospitals from the second tier of Malawi’s health system – these served more urban populations and handled referrals from health centres and other third-tier facilities. One tertiary hospital (Zomba Central) was from the top tier of the health system. The table below provides an overview of the geographical coverage of the sample (see Annex 4 for a map showing the location of sampled health facilities).
Ministry of Health: Department of HIV/AIDS (DHA); Quality Management Unit (QMU); Central Monitoring & Evaluation Division (CMED)
28 The following definitions were used based on a quarterly estimate: low volume: fewer than 1,500 HIV positive patients alive and on treatment; medium volume: 1,500 – 3,000 HIV patients alive and on treatment; high volume: over 3,000 patients alive and on treatment. All estimates were based on Quarter 4, 2016 data (source: DHA, quoted in PEPFAR EMR target list).
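To make these sampling thresholds concrete, the short sketch below encodes the footnote 28 definitions as a simple classification rule. It is an illustrative aid only, not part of the evaluation toolkit, and the example count is invented.

```python
# Illustrative encoding of the facility volume categories defined in footnote 28,
# based on the quarterly number of HIV positive patients alive and on treatment.

def facility_volume_category(patients_alive_on_treatment: int) -> str:
    """Classify a facility as low, medium or high volume."""
    if patients_alive_on_treatment < 1500:
        return "low volume"
    if patients_alive_on_treatment <= 3000:
        return "medium volume"
    return "high volume"

print(facility_volume_category(2100))  # 'medium volume'
```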
Mixed method data collection
As far as possible the endline data collection methods replicated those used at baseline (with some adjustments to accommodate COVID-19 travel restrictions). The methods used at endline are summarized below.29
● A knowledge, attitudes and practice (KAP) survey explored perceptions of health data quality, individuals’ use of data at work and organizational drivers of data use in decision-making. The survey questionnaire was the same as in 2017 and 2019,30 with three additional questions referring to COVID-19 monitoring data. The survey was conducted face-to-face by the local survey company. The enumerators captured 223 full, unique, KAP survey responses.
● Stakeholder interviews included key informant interviews (KIIs) and, where feasible and appropriate, focus group discussions (FGDs). The KIIs were largely organized and conducted remotely from the UK using Zoom or Teams video-calling software.31 In total, by mid-October, we had completed 15 KIIs and 6 FGDs involving 25 participants.32
● A Data Quality Audit (DQA) and User Assessment was used to objectively assess whether data quality in DHIS2 had improved over the life of the project. These assessments were informed by a key hypothesis in the original theory of change: that use of digital data depends on users’ trust in the quality of the data. As in 2017, four dimensions of data quality (completeness, accuracy, availability, and timeliness) were assessed for three proxy indicators, purposively selected as important to HIV service delivery and planning. For this exercise, the sample period was the three-month period from 1 January to 31 March 2021 (first quarter 2021). Data elements were collected for the following three indicators:
– Indicator 1: Number of pregnant women newly testing HIV positive
– Indicator 2: Number of HIV positive clients initiated on ART for the first time
– Indicator 3: Number of nurses posted at the facility.
Notably, the DQA was the most modified data collection activity since the 2017 baseline. This was partly due to pandemic travel restrictions,33 but was also intended to examine the effects of digital innovation (in line with Kuunika objectives). All relevant paper forms for the data elements from each of the facilities were photographed by the enumeration team at the same time as the KAP surveys and uploaded to a shared online folder where they could be viewed remotely.34 These were then compared with what could be found on the HMIS/DHIS2 system, which was fully accessible online.
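As an illustration of this online comparison step, the sketch below shows how reported values for one facility and month might be pulled from a DHIS2 instance through its standard Web API for side-by-side checking against a photographed paper form. It is a minimal sketch only: the base URL, credentials and all identifiers are hypothetical placeholders, and it is not the evaluation team’s actual tooling.

```python
# Minimal sketch: pull reported values from a DHIS2 instance for one facility and
# month so they can be compared with the value transcribed from a photographed form.
# The base URL, credentials and all IDs below are hypothetical placeholders.
import requests

BASE_URL = "https://dhis2.example.org"   # placeholder DHIS2 instance
AUTH = ("dqa_reviewer", "secret")        # placeholder credentials

def fetch_reported_values(data_set: str, org_unit: str, period: str) -> dict:
    """Return {dataElement: value} for one data set, facility (orgUnit) and period."""
    resp = requests.get(
        f"{BASE_URL}/api/dataValueSets.json",
        params={"dataSet": data_set, "orgUnit": org_unit, "period": period},
        auth=AUTH,
        timeout=30,
    )
    resp.raise_for_status()
    return {dv["dataElement"]: dv["value"] for dv in resp.json().get("dataValues", [])}

# Compare the online value with the value read from the photographed ANC report form.
online = fetch_reported_values("ANC_DATASET_ID", "FACILITY_X_ID", "202101")
paper_value = "42"                        # transcribed from the paper form (illustrative)
element = "NEW_HIV_POS_PREGNANT_ID"       # hypothetical data element ID
print("Match" if online.get(element) == paper_value else "Discrepancy")
```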
A new component introduced to the DQA in 2021 was an assessment of the DHIS2 user interface, examining how the look and feel of the system had developed from the perspective of a ‘semi-expert’ user.35 For this exercise, we applied the user assessment to both the HMIS and the One Health Surveillance Platform (OHSP) – with the latter included to reflect Kuunika’s pivot to COVID-19 monitoring since early 2020.
In addition, we used the Google Analytics system use metrics for DHIS2 supplied by Kuunika since 2017 to look at trends in use over the last five years.
29 Since the data collection instruments are sizeable documents they have not been included in this report; however, they are available from Mott MacDonald upon request.
30 For the 2019 and 2021 rounds we used the Kobotoolbox.com software.
31 Special Study 1 was authored by Dr Mwakilama who is based in Lilongwe and was able to conduct interviews face to face or by telephone.
32 In addition, over 40 KIIs were conducted for the special studies – these have informed our synthesis analysis.
33 The facility-level qualitative systems assessment interviews which were a feature of the 2017 DQA were omitted because COVID-19 restrictions prevented travel by the international evaluation team.
34 See Table 4 for some limitations of this approach.
35 The assessment was conducted by a Mott MacDonald epidemiologist, who is expert in looking for and understanding health data but a novice in Malawi’s HMIS and the DHIS2 system.
The endline results
This section summarizes the triangulated endline survey results to answer the core evaluation questions and looks back to the changes since 2017. Specific findings from each endline data source are contained in Annexes 3 - 5.
Has data quality improved?
Our DQA found that of the three sampled indicators, only one was reliably available on HMIS / DHIS2, with the other two not yet reaching the digital HMIS. The data quality for Indicator 1 appears to have improved in each sample district since the baseline DQA in Quarter 4, 2016. The other two indicators in our sample have not yet been included in DHIS2, despite HMIS plans to the contrary, for reasons outside Kuunika’s influence and control.
Indicator 1: Number of pregnant women newly testing HIV positive, performed best.
Key findings
● For the most part, data is accurately transferred from the paper ANC report forms to the online system, with the paper and online values matching in 11 of the 13 facilities with available forms (85% accuracy).36
● Interviewees consistently suggested that the ANC report form is the most reliable source of data for this metric on DHIS2.
● For the ANC Report form on DHIS2, 12 of 15 facilities had all months complete (80% completeness), all facilities had records available for all three months (100% availability), and 13 of 15 facilities entered all reports by the deadline each month (87% timeliness). The calculation of these proportions is sketched after this list.
● When compared to the baseline assessment (and keeping in mind definitions were different due to the paper-based versus digital system assessment), the data quality dimensions for Indicator 1 appear to have improved in each district between baseline (paper-based) and endline (DHIS2) (see Chart 1). An additional note is that the endline DQA data were submitted during the COVID-19 pandemic (Q1, 2021), demonstrating that the DHIS2 system is being consistently used as intended even during a public health crisis.37
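For transparency, the sketch below reproduces the simple proportion arithmetic behind the Indicator 1 scores reported above; the facility counts are taken from the findings, while the function itself is purely illustrative and not part of the evaluation toolkit.

```python
# Illustrative reproduction of the Indicator 1 data quality proportions: each
# dimension is the share of assessed facilities meeting the relevant criterion.

def dimension_score(meeting_criterion: int, assessed: int) -> int:
    """Return the dimension score as a rounded percentage of assessed facilities."""
    return round(100 * meeting_criterion / assessed)

indicator_1_scores = {
    "accuracy":     dimension_score(11, 13),  # paper and online values matched
    "completeness": dimension_score(12, 15),  # all months complete on DHIS2
    "availability": dimension_score(15, 15),  # records available for all three months
    "timeliness":   dimension_score(13, 15),  # all reports entered by the deadline
}
print(indicator_1_scores)
# {'accuracy': 85, 'completeness': 80, 'availability': 100, 'timeliness': 87}
```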
Indicator 2: Number of HIV positive clients initiated on ART for the first time, was not reliably available on HMIS / DHIS2 at endline, despite the earlier plan to migrate this data onto the system.
Key findings
● It was clear from our search of dashboards, pivot tables, and Data Set Reports that this data is only sporadically entered; only one out of three sample districts had any ART data available.
● MoH staff we talked to referred us to DHA-MIS for comprehensive ART data.
● Interviews revealed that HMIS officers at the district are aware of DHA-MIS and, therefore, do not spend time duplicating this work by entering the paper reports from facilities onto DHIS2; they understand that the ART data in DHA-MIS has been verified with an established process of validating data at each facility and is widely known to be reliable.
● The Department of HIV/AIDS has been working on an interoperability link between DHA-MIS and DHIS2 so that DHA-MIS data can be automatically transferred to DHIS2, but this is not yet complete. The interoperability link is expected to solve the DHIS2 ART data quality issue.
Indicator 3: Number of nurses posted at the facility was not available on HMIS / DHIS2 at endline. This was despite the fact that linking human resources for health databases to DHIS2 was seen as a priority at baseline.
Key findings
● Similar to our findings at baseline, there is no nationally consistent system for recording staff posted at facilities - with a proliferation of different paper-based and electronic recording forms found at different facilities.
● The former web-based human resource system, Integrated Human Resources Information System (iHRIS), has not been migrated onto DHIS2 as intended. Rather, this was stopped due to a policy change that decentralized human resources for health data systems.
● It was learned that in 2021 MoH released an official implementation plan to revive iHRIS. Once resources have been secured, this will include development of an interoperability link to DHIS2.
● In summary, we found minimal change in the reporting of human resource data since baseline. However, this appears to be due to delays in reviving and integrating iHRIS – these factors are outside the control of Kuunika.
Loading times: HMIS / DHIS2 web-page loading times have remained stable since early improvements in 2017-18.
Key finding
As shown in Chart 2 below, overall HMIS/DHIS2 web-page loading times – a measure of quality in terms of the availability of the digital data – have remained stable since early improvements in 2017-18.
Chart 2: HMIS / DHIS2 average loading time, Nov 2017 - Sep 2021
Source: Google Analytics, CMED
HMIS / DHIS2 dashboards: The HMIS / DHIS2 dashboard and applications (apps) generally have positive data use attributes.
Key findings
● Turning to the quality of the digital system as a whole, we found clear visual design of the dashboards and useful (if not always obvious) pivot tables. However, our user assessment identified a few issues with consistency and data labelling which could be ameliorated with some simple improvements.38
● Similarly, we found the OHSP / DHIS2 site created by LIN / Kuunika to house the COVID-19 monitoring data included some apps that were potentially useful for experienced users, but there were some omissions in data labelling and metadata. There is not currently enough data entered into the OHSP site for it to display useful results, but once data is entered regularly, there is potential for this to be a highly useful platform for visualising pandemic-related data; it was also found to be user-friendly in its design.
Has data use increased?
Use of HMIS / DHIS2 data: The number of monthly users of the HMIS / DHIS2 data repository has been increasing over time.
Key finding
● As shown in Chart 3 below, there has been a steady increase in the number of users accessing the HMIS / DHIS2 database on a monthly basis since baseline. There were approximately twice the number of unique monthly users in September 2021 as in September 2018.39
38 See Annex 3 for the data quality audit including the user heuristic assessment.
39 Note that users include data inputters and software managers.
Other indicators: The other indicators of use of the system show relatively little change over the same period.
Key findings
● The average number of pages viewed per visit, a proxy for the value / interest of the system for the typical viewer, was hypothesized to rise as the accessibility of DHIS2 and the ability of users to seek and apply data to their work improved. The bounce rate (the percentage of visitors who leave the site after viewing only the first page) is an alternative proxy for the same result; the calculation of both metrics is sketched after this list.
● As shown in Charts 4 and 5 below, the limited change in both indicators could be due to the fact that the two data elements we selected ended up on the same dashboard view, so could require just one page view.
● However, findings regarding limited change in systems / data use were consistent with responses from the key informant interviews and focus group discussions. These suggested the HMIS / DHIS2 data viewed may still be of limited value to target users.
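To clarify how these two proxies are derived, the sketch below computes pages per visit and bounce rate from a list of session-level page-view counts. The session data are invented for illustration; the actual figures in Charts 4 and 5 come from the Google Analytics exports supplied by Kuunika.

```python
# Illustrative computation of the two Google Analytics proxies discussed above:
# average pages viewed per visit, and bounce rate (share of single-page sessions).

session_pageviews = [1, 3, 1, 5, 2, 1, 4]  # invented page-view counts, one per session

pages_per_visit = sum(session_pageviews) / len(session_pageviews)
bounce_rate = 100 * sum(1 for pv in session_pageviews if pv == 1) / len(session_pageviews)

print(f"Pages per visit: {pages_per_visit:.2f}")  # 2.43
print(f"Bounce rate: {bounce_rate:.0f}%")         # 43%
```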
Have attitudes and practices around decision-making improved?
Sub-national user engagement: There was little sign that perceptions of ‘vertical data extraction’ have changed at sub-national levels since baseline.
Key findings
● Despite multiple examples given of Kuunika engagement with the districts, interviews at subnational level suggested most respondents felt they were primarily seen as sources of data, rather than as collaborators in designing digital systems for use in service delivery.
● Interview and FGD respondents indicated that the new COVID-19 monitoring processes perpetuate this perception, with both district and facility-level staff concerned that they had not been consulted in the design or use of the monitoring system – although they acknowledged the system set-up was necessarily urgent and fast-tracked.
● Kuunika reported that numerous data reviews and supervisions were conducted in relation to OHSP roll-out. However, some of our FGD participants reported that their regular data review meetings had been cancelled, and supportive supervision of data use had reduced during the COVID-19 emergency.
Data quality and use at sub-national levels: There is some evidence of improved perceptions of data quality, but evidence of improved data use practices is mixed.
Key findings
● The KAP surveys provided some evidence of improving perceptions of data quality in the information systems over time. For example, by endline, 58% of respondents reported that they considered the health information they accessed to be very accurate, compared to 24% at baseline (Chart 6). Similar findings were found for perceptions of data timeliness and completeness (Annex 4).
Chart 6: Perceptions of data accuracy from baseline to endline
Do you think the health and health related information you currently use to help you make decisions in your job is accurate?
● However, from interviews and enumerator observations, we found data management and data quality control practices at facility level to be ad hoc, with multiplying registers and summary reporting for siloed or new programs. Unskilled staff were still co-opted to do data entry and reporting. DHO staff, district coordinators and data clerks have DHIS2 access, but most of our other respondents still do not (or have access, but do not use it).
● The KAP survey pointed to regular data use for reviewing patient records and stock balances, although there appeared to have been an overall decline in the frequency of data use practices at endline (see Annex 4).
● A large number of respondents (from across data sources) reported use of the public COVID-19 website and use of informal WhatsApp groups to share updates. This could suggest that health worker peer and personal practices around data use have been boosted. As at midline, ‘data super-users’ still appear to be useful and active at facility level in this regard.
Have key [HIV] service areas improved as a result?
Contributions to intended HIV outcomes: In light of the project pivots and adaptations, the evidence base for Kuunika’s contributions to intended HIV service delivery outcomes remains fragmented and inconclusive.
Key findings
● Based on our evaluation evidence and review of secondary data, we can only note the following few references to HIV or other services. At this stage, we cannot infer that they are due to any changes in data use or data systems in general, or actions by Kuunika in particular.
● During the pandemic, some facilities have increased the prescription of ART per patient visit to six months. This is intended to reduce the numbers of people attending health centres.
● Loss to follow-up is expected by some to have increased due to people not coming to clinics during the COVID-19 crisis.
● No respondents were aware of any data on HIV prevention activities or reporting of non-biomedical interventions (e.g. in the Local Authority HIV and AIDS Reporting System (LAHARS)) that could be attributed to the project.
The Special Study findings
This section summarizes the key findings of the ‘deep dive’ special studies. These studies aimed to understand the implications of key issues and events during project implementation, with the ultimate aim of informing a revised theory of change. The special studies focused on the pivot to COVID-19, digital health governance and the role of Kuunika at district level.
Methodological approach
The methodology for all three studies consisted of a desk review of research literature and project documents, key informant interviews and a briefing discussion with the Kuunika team. Special Study 3, on the Districts, benefitted from focus group discussions carried out as part of the endline surveys. Limitations of the studies (e.g. remote data collection and a small number of key informants available for interview) are acknowledged. These limitations were partly mitigated by intensive triangulation against secondary data sources.
Special Study 1: Kuunika, COVID-19 monitoring and the digital surge
This study looks at the impact of the COVID-19 monitoring needs in Malawi and asks whether it has led to a similar ‘digital surge’ in the health sector as seen in other countries. It explores how Kuunika has contributed to this and what it could do to sustain this effect.
Key findings
● Digitalization as a lynchpin for responding to the pandemic. Recognizing that technology would be key to monitoring and containing the pandemic, MoH’s Public Health Institute of Malawi (PHIM) engaged Kuunika to develop a solution. Kuunika worked with MoH governing bodies and stakeholders and moved quickly to develop and deploy LIN/UNICEF’s integrated disease surveillance and response (IDSR) system (at that point still at concept stage), modelling it around the WHO COVID-19 toolkit to create the One Health Surveillance Platform (OHSP) on DHIS2. The adapted OHSP was used for monitoring and reporting on COVID-19 in the districts and, for the first time, at ports of entry.
● Digital solutions developed by Kuunika. To meet the data needs of MoH in responding to the pandemic, the Kuunika project led on delivery of the following suite of tools and support:
– OHSP, used by District Health Surveillance Teams, Health Facility IDSR Coordinators, data clerks and ports of entry across the country. Features include patient screening, patient tracking and follow-up, contact tracing, case management and laboratory sample tracking
– Standardized forms / tools for reporting on COVID-19
– A COVID-19 website for PHIM
– A national information dashboard
– Situation reporting linked to the PHIM website
– Setting-up of the public health emergency operations centre at PHIM
– Electronic vaccine certificates
– High-frequency data collection tools
– Mass gathering model
– Interactive epidemiological model
– Incident management system
– Community applications (WhatsApp, Chatbot, USSD platform, SMS platform).
● The Kuunika team engaged with MoH and the Pandemic Task Force (PTF) to ensure that the products met their needs. Development of OHSP was finalized swiftly and its first deployment took place in April 2020, about a month after the country reported the first cases of COVID-19.
● A consultation and requirements gathering process. Evidence suggests that requirements for the digital solutions that were developed largely came from senior officials in MoH assigned to work with the Kuunika team. However, most OHSP system users to whom we spoke indicated they were not consulted in this process, which might explain why both data inputters and data users faced some problems when accessing the system. They felt the system did not fully reflect how things worked on the ground (e.g. actual patient and traveller flows). Workflows were changing rapidly as the emergency evolved and, while Kuunika staff did go out for site visits to update tools, these were often soon out of date.
● District officials were sometimes dissatisfied that they were not able to use OHSP to check or update district data, since the system only shows aggregated national totals. Further, system requirements were captured through meeting resolutions or minutes instead of the standard System Specification Interface document.40 This is understandable given the urgency of the pandemic situation, but needs to be acknowledged and avoided in future.
● With limited initial funding and devices, OHSP deployment took a staggered approach, rightly prioritizing the high-burden localities, cities and districts. Existing MoH partners, including BMGF, UNICEF, CHAI, WHO, USAID and MSH were brought together and organized by Kuunika to provide support with servers, tablets, phones and funding. This rare example of cooperation between digital partners in actual implementation to address priority infrastructure and deployment needs is an important achievement. It is also an example of donor coordination that could be the basis of future practice.
● Functionality of the system has remained problematic, with most users (61%) indicating that the system was not functional at all times, either due to connectivity issues (see below) or due to system updates that created problems for system log-in, or use of new forms / features (for which users had not been orientated).41
● There remain significant challenges with internet connectivity, with only just over half of users (53%) stating they have connectivity at all times, while the rest face problems. Although the mobile operator TNM offered free data bundles to the Ministry for use by health workers, in places with no TNM coverage, health workers could not benefit from this. A majority of health workers (70%) surveyed were using personal resources to purchase data bundles, which was problematic.42
● Gaps in data affecting reports and other outputs. Some gaps in data sent to central-level servers are likely to affect the quality of reporting and the production of other outputs. For instance, e-vaccine certificates could not be produced for individuals whose data was not entered in OHSP following the manual capture of data. This was reportedly due to devices often not syncing after data had been entered, due to the lack of data bundles, slow internet connections, or other connectivity failures.43
● Systems support post-deployment has primarily been done remotely by the Kuunika team, since there appears to be no permanent solution to the need for system technicians in health facilities and districts to provide the necessary support. Within MoH and the districts, the Department of e-Government has technicians providing IT support to government institutions, including District Health Offices. Kuunika reports that this technical support was
40 The System Specification Interface document is usually standard practice since it becomes the point of reference during development and post-development reviews and record-keeping.
41 Source: Findings from additional questions included in the endline KAP survey, and qualitative enquiries.
42 Source: Findings from additional questions included in the endline KAP survey.
43 Kuunika reports that this problem had now been addressed.
used during the height of the OHSP roll-out. Consideration should, therefore, be given to building on this experience and formalizing it to provide timely systems support at district level. This gap could also explain some of the challenges which OHSP users were facing regarding systems support.44
● Direct involvement and / or use of the COVID-19 digital solutions has primarily been limited to certain cadres and individuals in health facilities and ports of entry. In several health centres and ports of entry, only one focal person (e.g. HSA) was involved. Similarly, at district level, only the IDSR focal point or DEHO were involved. Those not involved had no access to this data until after it was packaged and presented by PHIM on their website, and through daily updates released by the PTF to the general public. While this was considered unsatisfactory, other key informants argue that the unprecedented speed and regularity (sometimes daily) with which data was analyzed and fed back to districts represents a major shift in real-time data transparency and availability.
● Capacity building on the use of OHSP. The Kuunika team built the capacity of MoH staff at district level on the use of the OHSP system, including data capturing, synchronizing and use. However, there is no evidence of sustainable capacity yet built locally (i.e. at district or health centre level) on the technical side of digital solutions, including their maintenance and scale-up.
● Politics in the digital health space. Kuunika and the Digital Health Division supported MoH leadership and the national COVID-19 coordination effort to great effect, successfully aligning resources and designing systems around DHIS2. However, the special study enquiry found that other Implementation Partners continue to promote their own solutions and research tools in the districts, with some evidence that this ‘duplication of effort’ is overwhelming the healthcare workers, and even causing them to abandon government reporting systems.
● Lack of clarity on ownership and maintenance of mobile devices: There is no clear policy on the ownership, responsibility and maintenance of mobile devices used for uploading data at points of care. This can lead to ambiguity about respective responsibilities regarding faulty or lost devices. Additionally, when mobile devices are personalized, this can restrict usage by other officers.
Figure 8 below provides a diagrammatic overview of lessons learnt from Special Study 1
Special Study 2: Kuunika – the Regulatory & Policy Essentials
This study examines the role of Kuunika in strengthening digital health governance in Malawi, with particular reference to intended deliverables (e.g. the Demographic Data Exchange), implementation and aid effectiveness issues, and the policy and regulatory environment.
Key findings
A changing context for governance and system deliverables
● Shifts in the wider operational ecosystem: Since the initial design phases of Kuunika, there have been significant ‘shifts’ in the digital health and operational ecosystem that have disrupted implementation. These ‘shifts’ have included: technological advances and changes in Malawi’s wider digital landscape; shifts in donor EMR investments; changes in political and institutional leadership; reconfiguration of the Kuunika consortium; and a shift in Kuunika’s focus to the Digital Health Division. In addition, the global COVID-19 pandemic has presented both challenges and opportunities.
● Contributions to the Expanded Health Data Exchange: Despite this changing ecosystem, Kuunika has made significant contributions to the Expanded Health Data Exchange and related system components in keeping with its strategic ‘building blocks’ approach. These contributions have included registry services (such as the Terminology Service and Facility Registry) and domain services (such as the Logistics Management Information Service and the Health Management Information Service).
● Advancing a new digital health vision: Since 2019, Kuunika has supported development of a National Digital Health Strategy, 2020-2025. Alongside this, Kuunika’s role in the national Digital Health Division has involved consensus-building on an updated vision for the digital health architecture based on a modular OpenHIE framework. Under this vision, the Demographic Data Exchange (DDE) feature would over time be replaced by a master Client Registry and an Electronic Health Record (EHR) system. Kuunika is now systematically advancing realization of this digital health vision through collaborative working, while keeping government in the ‘driving seat’.
Role of implementation & aid effectiveness issues
● A fragmented digital health landscape: Successive situation analyses have shown Malawi’s digital health landscape is / remains highly fragmented.45 Available evidence suggests digital health investments primarily serve the disease focus of donors and their reporting systems. Desk review evidence indicates that, among donors and government, there has also been a lack of harmonized supplier procurement, weak digital health curation and, hitherto, a lack of buy-in to a shared interoperability infrastructure.
● New opportunities: Kuunika’s role in the Digital Health Division is generally seen as positive for improving the technical leadership, capacity and credibility of government. Key informants suggest the Digital Health Strategy, 2020-2025, along with mapping of a coherent digital health architecture (based on the OpenHIE framework) and contributions to key building blocks (such as the Interoperability Layer), also create opportunities for improved partner collaboration based on aligned digital health investments.
● Some concerns: Although Kuunika’s role in the Digital Health Division is widely seen as catalytic, there are some concerns about a potential conflict of interest if Kuunika partners continue to compete for implementation resources. Some observers suggest Kuunika’s role should be translated into a formal capacity development plan and exit strategy to ensure sustainability is not compromised by ‘capacity substitution’.
Role of intellectual property regulation, data privacy & governance standards
● The Department of e-Governance under the Ministry of Information and Communications Technology (ICT) leads on the development of ICT legislation, policies, strategies and standards for the Government of Malawi. The Department of e-Governance seconds specialist staff to the MoH’s ICT Section and has particular responsibility for guiding digital health governance at MoH. Key informants suggest there is scope for closer collaboration between the Digital Health Division and the Department of e-Governance at national and sub-national levels – especially as digital health initiatives are scaled up.
● Key digital legislation and policies are in place in Malawi. These include the eTransactions and Cyber Security Act (2016); the Access to Information Act (2017), the National Registration Act (2009), the National ICT Policy (2013) and, for MoH, the National Health Information Systems Policy (2015). Malawi is also a signatory to several international and regional instruments for protecting data privacy. In addition, the new Data Protection and Privacy Bill, 2021, aims to provide a comprehensive legislative framework for the protection and security of personal data – although there are stakeholder concerns about the ambiguity of some terminology, the autonomy / capacity of key regulatory authorities, and implications for the National Registration and Identification System (NRIS).
● Intellectual property regulation: Malawi has five main intellectual property laws covering trademarks, patents and copyright. Existing legislation and policies also cover issues of data sovereignty and the transfer of data across borders. Key informants suggest this legislation is rarely harnessed to restrict data reporting to donors, although the requirements of national health research ethics committees (e.g. the National Health Sciences Research Committee) are increasingly stringent.
● Standard Operating Procedures: In keeping with Kuunika’s listed deliverables, 12 SOPs relating to digital health governance are in progress. Drafts are in place for nine key SOPs, but three important SOPs on data security, data review meetings and quality management approaches have not yet been drafted. Drafted SOPs have been awaiting finalisation and ratification since 2020.
● Governance capacity in the Digital Health Division: Governance positions on policy and standards compliance in Digital Health Division have been identified, but these positions have not yet been filled.
Special Study 3: Kuunika and the Districts
This study examines the extent to which the Kuunika project may have responded to, or influenced, decentralization processes and outcomes. It considers whether and how the project has supported health service delivery at the district level, and whether decentralization processes have had any bearing on such work.
Key findings
● Decentralization of the health system and digital data is beyond the control of any single project, including Kuunika. Available information indicates that there are currently (mid-2021 onwards) changes being made at national government level specific to what might be seen as a (re)centralization of digital health data management and a (re)focus on DHIS2.
● The partial, piecemeal and stop-start implementation of the government policy on decentralization is sometimes perceived as a cause of poor governance in the health sector. Governance challenges are a significant barrier to achieving a more effective and equitable health system in three key domains: accountability (enforceability; answerability; stakeholder-led initiatives); health resource management (healthcare financing; drug supply); and influence in decision-making (unequal power; stakeholder engagement).
● Districts are the ‘missing middle’ in many aspects of government and donor / partner engagement with the health sector in Malawi. In recent years, there has been an increased focus on community engagement (not least for equity considerations). Districts remain the main entry point for the majority of health services delivered to Malawians.
● When Kuunika was being designed, the Government of Malawi (GoM) was proposing to decentralize the health system; however, most decisions and human resource management have remained at the central level. There is a centralized ‘push system’ in health. Most donor projects at district level have had to work very closely with the MoH, regardless of a primary focus on districts – this may have been the case for Kuunika too.
● The institutional arrangements for the Digital Health Division could determine its level of impact at the district level. These institutional arrangements are underpinned by competing political perspectives and alliances. However, we note some stakeholders believe the current location of the Digital Health Division under the MoH Directorate of Planning and Policy Development has implications for the strategies the Division can adopt – especially for embedding digital health solutions to improve data quality and use for decision-making at district and facility levels.
● Cumulative evidence from successive Kuunika evaluations points to the need for greater genuine ownership of data at the district level and effective use of DHIS2 as a platform for more effective evidence-based planning. This necessitates buy-in from all partners working at national and sub-national levels. However, the donor landscape has remained fragmented; this, in turn, has had implications for effective coordination of digital health initiatives. District Health Offices should ideally be integral partners in dialogue with donors on digital health solutions for data collection and use.
● Kuunika has invested a great deal of time, effort and resources in district-level capacity development, including training on and access to DHIS2 and digital data hardware and systems (e.g. dashboards, the mobile App, Cluster meetings). This was most apparent before 2019, at which point changes in consortium partners and project management, coupled with the sustainability pivot, led to Kuunika being perceived as more distant from the districts (except in Zomba). More recent support to horizontal engagement (e.g. through support for Cluster meetings), is often not recognized as having Kuunika inputs. Nevertheless, these initiatives are reported to be useful channels for debate and processes of decision-making.
● Despite an explicit district-level focus in the early phases, Kuunika appears to have given relatively limited attention to supporting decentralized structures in the sustainability phase. The temporary involvement of the districts in Kuunika planning after the first pivot appears to have been short-term. At this stage, Kuunika might most usefully be regarded as sitting in the ‘functions and capabilities’ space – with its contribution to decentralization of health services lying in its ability to empower districts through functional district health systems that improve access to quality data for decision-making.
● The decision to focus on HIV as the data use case meant there was little space for the district level to engage as an equal partner in the management and use of digital data. This was largely because the HIV and AIDS program is highly vertical, with key datasets collected and managed by the Department of Nutrition, HIV and AIDS (DNHA) and not routinely uploaded to DHIS2. Indeed, at one time the Kuunika Project fell under the Department of HIV and AIDS, so operated within this vertical (somewhat “parallel”) structure.
● Initially the primary focus of Kuunika was data systems (including improved data quality and data use). However, the emphasis on the HIV use case meant there was a particular focus on patient outcomes and use of data to support comprehensive HIV service delivery.46 If fully embraced going forward, this approach would fundamentally require full engagement of districts to ensure sustainable improvements in service delivery.
● From the endline survey and DQA analysis, we found little evidence of sustained or systemic improvements in data use for health sector decision-making at district level. Similarly, from the desk review, there was little evidence that key planning and service delivery documents (such as District Implementation Plans) are being progressively informed by quality data derived from DHIS2.
● At midline and endline, we found that there are some DHIS2 ‘data super users’ at district and facility level. Data super users are highly able and motivated individuals who are exceptionally engaged and active in data use to inform professional decision-making. These data super users have the potential to become champions of change.47 However, the question remains about how to maximize this potential without placing an unrealistic burden on individuals.
● The Blantyre Prevention Strategy provides a potential model for best practice. Since 2020, BMGF has funded the Blantyre Prevention Strategy. This strategy was co-developed with local and national government and a consortium of partners and supports the development of an optimal system for the sustained prevention of HIV infection that is fully embedded in local structures, and led by the District Health Officer. Stakeholders report that the design of the strategy was informed by lessons from the Kuunika experience: “we re-thought the process, based on Kuunika challenges with decentralization.”
Figure 10 below provides an overview of key lessons identified from Special Study 3
46 This was in keeping with Malawi’s commitment to achieving UNAIDS 90-90-90 targets and cascade of care approaches. See: Avert. (2020). HIV and AIDS in Malawi. Available at: https://www.avert.org/professionals/hiv-around-world/sub-saharan-africa/malawi
47 Shea, C. M. et al. (2016). Quality improvement teams, super-users, and nurse champions: a recipe for meaningful use? Journal of the American Medical Informatics Association, Volume 23, Issue 6, Nov 2016, pp. 1195–1198.
ANNEXES
Annex 1: Phase 1 Theory of Change – evidence and hypotheses
Rapid review of the evidence base
We found that the rationale for the design of Kuunika and its goals was supported by both research and ‘experiential’ evidence. The evidence confirms that new information technology creates the potential for the greater use of data by reducing the time and cost of doing so (Yost et al. 2014; Nutley et al. 2013; Peirson et al. 2012; Dobbins et al. 2009). It can also increase the return to users’ time by improving data quality: new IT makes more data much more quickly available, more error-free and more easily manipulated and combined (Rowe et al. 2010; Mutale et al. 2013; Mate et al. 2009; Jones et al. 2013; Nutley et al. 2007). These properties, when evident and consistently available, can drive a process of individual ‘self-directed learning’ and spontaneous increased use (Knowles 1975; Tough 1967).
However, there are often internal institutional barriers which limit the potential for change (Orton et al. 2011). Social theories of behaviour change suggest learning is 'a process of social interactions between actors within a network' (Reed et al. 2010). External, societal pressures to drive change (especially in public sector organisations) may therefore also be necessary if change is to be embedded and sustained. This suggests that the adoption of IT, and the use of the data it makes available and accessible, at an organisational level requires more than sound infrastructure and strong individual skills development – it also involves changes at interpersonal, organisational, and institutional levels (Orton et al. 2011; Armstrong et al. 2013). These may not be so amenable to purely technical solutions. New models and definitions of organisational 'capacity' emphasise that it consists of 'a multi-layered set of processes' which are about more than skills and technology: capacity development is complex and multi-dimensional (Ubels & Fowler 2010; Baser & Morgan 2008), requiring an understanding of whole systems and interventions at different levels of those systems.
Hypotheses underpinning the theory of change
We articulated a set of over 20 hypotheses and foundational assumptions found in the literature for which there was strong enough evidence to suggest they should be considered by the evaluators as factors that could influence Kuunika’s contribution and outcomes. These hypotheses / assumptions are summarised in the table below.48
The process of creating and retro-fitting a theory of change raised two significant questions for the evaluation team about the original Kuunika design:
● Not all of the many planned Kuunika activities and investments were clearly located on a results chain or causal pathway.
● A significant body of research literature identifies an intermediate stage (or on-going process) of organisational culture change which is necessary to translate more and better data into better decision-making and outcomes. This process of organisational change has not been fully articulated or addressed by Kuunika.
The theory of change we created in 2017 added a new intermediate outcome of organisational change, for which the planned training and mentoring inputs were necessary but likely not sufficient. Again, the evaluation aimed to articulate just how, and how well, Kuunika influenced this hypothesised process of organisational change.
Table 6: Hypotheses and assumptions for the initial Kuunika theory of change
Hypotheses and assumptions (with sources)
1. HMIS systems management procedures, including availability of material resources for HMIS and supervision for HMIS within the past six months, are associated with better data quality. (Source: Ahanhanzo, Y.G. et al. 2014)
2. There are important planning and service delivery advantages to using routine/administrative data (instead of survey data) – e.g. the power to describe all administrative levels in the country, near-real-time accessibility, and reduced cost. (Source: Hung, Y.W. et al. 2020, updated)
3. A lack of trust in the quality of administrative data discourages its use. (Source: Mutale, W. 2013; Mate, K. et al. 2009)
4. Data users will have the cognitive skills to transform 'data' into 'knowledge'. (Source: Reynolds, M. 2016)
5. The degree of organisational and political decentralisation can affect the use of evidence in decision making. (Source: Liverani et al. 2013; Beck et al. 2005)
6. Individual beliefs, attitudes and motivations to use data and evidence are connected to pre-existing norms and values that prevail across organisations or wider society. (Source: BCURE Evaluation Evidence Review, p.37)
7. There are formal and informal 'policy/practice making cycles': the formal policy-making cycle uses peer-reviewed research evidence and debate to inform collective understanding and promote step changes in policy; the informal policy cycle is messier and more opportunistic. (Source: Jones, H. 2009; Nutley et al. 2007)
8. Beliefs about what counts as 'good' evidence may result in data being discounted; certain evidence may be viewed as 'unacceptable' in particular contexts and so ignored. (Source: BCURE Evaluation Evidence Review, p.43)
9. Policy making [and the development of practice] is often messy and opportunistic, using pragmatic decisions [based on evidence emerging at the local level] by a range of actors. (Source: Jones, H. 2009)
10. HMIS data can directly inform practice and policy change or shift understanding and awareness of an issue. (Source: Nutley et al. 2007)
11. Lack of trust in administrative data leads to [costly, possibly contradictory] parallel M&E systems (e.g. for specific service areas or health conditions) being established and used. (Source: Sucich, K. 2019, updated)
12. Organisational tools and systems designed to promote the use of data in policy and practice (such as guidelines, templates and procedures for incorporating evidence into programme design) can motivate individuals to use evidence more in their day-to-day work. (Source: Yost et al. 2014; Nutley et al. 2013; Peirson et al. 2012; Dobbins et al. 2009)
13. Claims of 'lack of time' may be linked to organisational values and norms around data use – whether individuals are given the permission and space in their working days to spend time finding and using data. (Source: Orton et al. 2011; Armstrong et al. 2013)
14. Data (and evidence) use is influenced by the type and nature of relationships between [individuals or institutions]. (Source: Walter et al. 2005)
15. Hierarchical management of information or organisational silos can limit access to data and its use; divisions of responsibilities and 'institutional silos' can also limit consideration of evidence. (Source: Trostle et al. 1999)
16. Civil society may play a number of different roles in relation to data/evidence use, including putting pressure on government to use evidence, building momentum behind ideas, and bringing together different forms of knowledge. (Source: BCURE Evaluation Evidence Review, p.51)
17. Policy/practice development processes involve 'a disorderly set of interconnections and back-and-forth' between different groups. (Source: Weiss, C.H. 1979)
18. Individuals are empowered through access to data. (Source: BCURE Evaluation Evidence Review)
19. Interoperability is technically and organisationally feasible. (Source: Evaluation Team)
20. Clinicians do not follow patient management guidelines due to lack of time and heavy workload. (Source: Barth, J.H. et al. 2016)
21. Running two parallel systems (EMR and paper) makes health workers consider EMRs redundant and cumbersome. (Source: Gadabu, O. 2010)
Bibliography for the theory of change
Ahanhanzo, Y. G. et al. (2015). Data quality assessment in the routine health information system: an application of the Lot Quality Assurance Sampling in Benin. Health Policy and Planning, 7, pp.837–43.
Armstrong, R. et al. (2013). Knowledge translation strategies to improve the use of evidence in public health decision making in local government: intervention design and implementation plan. Implementation Science, 8, 121. https://doi.org/10.1186/1748-5908-8-121
Barth, J.H., et al. (2016). Why are clinical practice guidelines not followed? Clin Chem Lab Med. 2016 Jul 1;54(7):1133-9.
Baser, H. and P. Morgan. (2008). Capacity, Change and Performance Study Report. (ECDPM Discussion Paper 59B). Maastricht: ECDPM.
Beck, M., Asenova, D. & Dickson, G., (2005). Public Administration, Science, and Risk Assessment: A Case Study of the U.K. Bovine Spongiform Encephalopathy Crisis. Public Administration Review, 6(4), pp.396–408.
Dobbins, M. et al. (2009). A randomized controlled trial evaluating the impact of knowledge translation and exchange strategies. Implementation Science, 4, 61. https://doi.org/10.1186/1748-5908-4-61
Gadabu, O. J. et al. (2010). Is transcription of data on antiretroviral treatment from electronic to paper-based registers reliable in Malawi? Doctors Without Borders publication. Available at: https://fieldresearch.msf.org/bitstream/handle/10144/204845/Gadabu%20Data%20transcrption%20Malawi%20PHA.pdf?sequence=1&isAllowed=y
Hung, Y.W. et al. (2020). Using routine health information data for research in low- and middle-income countries: a systematic review. BMC Health Services Research, 20(1), 790.
Itad. (2016). BCURE Literature Review Section 3 – What is the evidence on how to build capacity for evidence-informed policy making? Available at: https://www.itad.com/knowledge-product/bcure-literature-review-section-3-what-is-the-evidence-on-how-to-build-capacity-for-evidence-informed-policy-making/
Jones, H. (2009). Policy-making as discourse: a review of recent knowledge-to-policy literature. A Joint IKM Emergent–ODI Working Paper No. 5, August 2009. Available from emergentworks.net.
Knowles, M. (1975). Self-Directed Learning: A Guide for Learners and Teachers. New York: Association Press. Available at: https://www.worldcat.org/title/self-directed-learning-a-guide-for-learners-and-teachers/oclc/231857437
Liverani, M., Hawkins, B. & Parkhurst, J.O. (2013). Political and Institutional Influences on the Use of Evidence in Public Health Policy. A Systematic Review. PLoS ONE, 8(10), p.e77404. Available at: http://dx.plos.org/10.1371/journal.pone.0077404
Mate, K.S. et al. (2009). Challenges for routine health system data management in a large public programme to prevent mother-to-child HIV transmission in South Africa. PLoS ONE, 4(5), pp.1–6. Available at: https://pubmed.ncbi.nlm.nih.gov/19434234/
Mutale, W. et al. (2013). Measuring Health System Strengthening: Application of the Balanced Scorecard Approach to Rank the Baseline Performance of Three Rural Districts in Zambia. PLoS ONE, 8(3). Available at: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0058650
Nutley, S.M., Walter, I. & Davies, H.T. (2007) Using evidence: How research can inform public services. Bristol: The Policy Press.
Nutley, T., McNabb, S. & Salentine, S. (2013). Impact of a decision-support tool on decision making at the district level in Kenya. Health Research Policy and Systems, 11(1), p.34. Available at: http://health-policy-systems.biomedcentral.com/articles/10.1186/1478-4505-11-34
Orton, L. et al. (2011). The use of research evidence in public health decision making processes: systematic review. PLoS ONE, 6(7). Available at: https://pubmed.ncbi.nlm.nih.gov/21818262/
Peirson, L. et al. (2012). Building capacity for evidence informed decision making in public health: a case study of organizational change. BMC Public Health, 12(1), p.137. Available at: http://bmcpublichealth.biomedcentral.com/articles/10.1186/1471-2458-12-137
Reed, M. et al. (2010). What is Social Learning? Ecology and Society, 15(4), p.r1. Available at: http://www.ecologyandsociety.org/vol15/iss4/resp1/
Reynolds, M. (2016). Heuristic for teaching systems thinking. The Open University repository of research publications and other research outputs.
Rowe, L.A. et al. (2010). Building capacity in health facility management: guiding principles for skills transfer in Liberia. Human Resources for Health, 8(1), p.5. Available at: http://humanresources-health.biomedcentral.com/articles/10.1186/1478-4491-8-5
Sucich, K. (2019). The Value of Data Trust in Healthcare Analytics. Dimensional Insight (dimins.com).
Tough, A. (1967) Learning Without a Teacher: A study of tasks and assistance during adult self-teaching projects. Available at: http://www.ieti.org/tough/books/lwt.htm
Trostle, J., Bronfman, M. & Langer, A. (1999). How do researchers influence decision-makers? Case studies of Mexican policies. Health Policy and Planning, 14(2), pp.103–14.
Ubels, J. & Fowler, A. (2010) Capacity Development in Practice. London: Routledge.
Weiss, C.H. (1980). Knowledge Creep and Decision Accretion. Science Communication, 1(3), pp.381–404.
Yost, J. et al. (2014). Tools to support evidence-informed public health decision making. BMC Public Health, 14(1), p.728. Available at: http://bmcpublichealth.biomedcentral.com/articles/10.1186/1471-2458-14-728
Annex 2: Kuunika achievements & challenges at midline
The table below summarises key Kuunika achievements and challenges identified at the independent midline evaluation.
Table 7: Summary of key Kuunika achievements and challenges at midline
Key achievements
1. In sampled districts, good evidence that end users had been trained on routine health data collection and data entry into paper registers, with on-the-job training across the board. All of the data entry respondents at health facilities with EMRs confirmed they had been trained on EMRs.
2. Good progress in design and implementation of the HMIS smartphone application (app), along with training in linking the new app and digital services to health workers’ routine work needs and procedures
3. While paper records remained by far the most popular source for looking up data in 2019, the use of DHIS2 had risen from less than one in five (18%) of all respondents to almost one in four (23%). Use of DHIS2 to enter data had increased since 2017 (baseline) by 7% to 10% of respondents. More data producers/entry users of EMRs and DHIS2 said they found it easier to enter data in 2019 compared to 2017. With regard to DHIS2, 61% of those who reported accessing it at midline found it 'fairly' or 'very' easy, compared to only 15% at the 2017 baseline.
4. Both paper and EMR systems seemed to be taking less time overall for data entry than at baseline. The main reason for not being able to enter data in 2019 was the absence of an internet connection or other IT system failures. The problem of lack of power (cited in 2017 as the main reason) had substantially reduced; however, the frequency of respondents reporting a problem with digital data entry in the previous month had risen.
5. There has been a general increase in ‘high frequency’ searching for health data since 2017, with almost half (47%) of all respondents reporting they look up data ‘more than once a day’, compared with under a third (30%) in 2017.
6. By midline, there had been some improvement in the verification or triangulation of EMR and paper register health data. By 2019, the Data Cluster Meetings convened by the Kuunika District M&E Officer provided an opportunity for a more representative group of data users (beyond senior cadres) to get together to assess data quality issues and how to address them.
7. By 2019, there was some evidence of a stronger and more active data culture. For example, the proportion of managers requesting information at least once a week had risen from just over one in five (22%) in 2017 to over one in three in 2019. The proportion of respondents strongly agreeing that their colleagues valued data use for decision making had risen from 7% at baseline to 19% in 2019; however, more respondents identified personal incentives and rewards as key drivers for data use than at baseline.
8. In 2019 as in 2017, we found almost complete agreement from our KAP respondents (98% and 96% respectively) that they knew where and how to find the information they needed for their job.
9. Although sharing of data between healthcare providers had shown some decline by midline, this appeared to be associated with increased provider awareness of issues of data privacy and confidentiality.
Key challenges
1. The introduction of the EMRs, with frequent system down-times, was disrupting the use of paper registers and potentially reducing data quality – this was reported as a greater problem at midline than at baseline. Not only were power outages and connectivity challenges affecting EMR use and causing data back-filling problems, but recent updates of the ART and ANC modules of the EMRs also contained several coding errors. At midline, no respondents yet felt confident to replace paper-based registers with EMRs.
2. Growing concerns about the limitations of the EMR technology being rolled out to target districts. By 2019, there were increased user complaints of repeated EMR / data entry failures and significant backlogs of client data that needed to be 'back-entered'. Users reported that more time was being spent on managing failing digital systems, which was compromising their ability to provide and manage client services.
3. There was very little verification or triangulation of EMR and paper register health data by facilities in advance of sending monthly reports to the DHA (although there was some evidence of a small improvement in this by midline).
4. Use of other Kuunika target databases (to enter data) was falling compared to 2017.
5. The main reason for not being able to enter data in 2019 was now the absence of an internet connection or other IT system failures.
Annex 3: Data Quality Audit methodology and key findings
This annex describes the key features of the endline DQA methodology and findings, including the heuristic User Assessment. The full DQA protocol and a more comprehensive account of the DQA methodology and findings are available from Mott MacDonald upon request.
Description of the modified endline DQA
Due to the COVID-19 pandemic and associated travel restrictions, the endline DQA was conducted remotely. The evaluation team was given remote access to the DHIS2 system, which is fully accessible online. All relevant paper forms for the data elements (described below) from each of the 15 facilities were photographed by an enumeration team and uploaded to a shared online folder where they could be viewed remotely. The three-month period from 1 January 2021 to 31 March 2021 (the first quarter of 2021) was selected as the sample period.
Due to the remote nature of the evaluation, some of the components of the baseline DQA that involved facility visits could not be replicated, and so as an alternative, an in-depth ‘heuristic assessment’ of the DHIS2 website was conducted. For the quality assessment of indicators, rather than the evaluation of paper forms at facility level as conducted in the baseline DQA, the endline DQA focused on the data in DHIS2.
We have learnt that, since 2020, Kuunika has been involved in the monitoring and reporting of the COVID-19 pandemic in Malawi, creating a digital One Health Surveillance Platform (OHSP) that also sits on DHIS2. In order to capture and evaluate this activity, the endline DQA also included an assessment of the OHSP website. This assessment was less in-depth than that of the DHIS2 website as we understand it is a work in progress.
The following data was assessed as part of the endline DQA:
● A comparison of three health metrics, or 'proxy indicators', in the DHIS2 system:
– Indicator 1: Number of pregnant women newly testing HIV positive
– Indicator 2: Number of HIV positive clients initiated on ART for the first time
– Indicator 3: Number of nurses [posted] at the facility
● Trends in HMIS/DHIS2 user metrics from 2017 to 2021
● A 'heuristic assessment' of the user look and feel of the DHIS2 website
● A lighter assessment of the user look and feel of the OHSP website.
The three proxy indicators above were assessed against four dimensions of data quality (completeness, accuracy, availability, and timeliness), with definitions appropriate to DHIS2 rather than to paper registers. The definitions used in the baseline and endline DQAs are described in Table 8 below.
Table 8: Definitions of the data quality dimensions for the baseline and endline DQAs
1. Completeness
Baseline definition: Proportion of reports and registers viewed by the DQA that have zero empty fields
Endline definition: Proportion of DHIS2 Data Set Report pages viewed by the DQA that have zero empty fields
2. Accuracy
Baseline definition: Monthly totals in the registers are the same as recorded in the monthly paper summaries at facility and district level; EMR monthly totals are the same as the monthly paper summaries at facility and district level; DHIS2/iHRIS/DHA website totals are the same as the paper summaries
Endline definition: Proportion of DHIS2 website totals that are the same as in the paper registers or reports
3. Availability
Baseline definition: Proportion of districts' facilities that can produce a full set of [paper] registers and reports on request
Endline definition: Proportion of facilities that have data available for the full 3-month period on DHIS2
4. Timeliness
Baseline definition: All dates on which activities took place are recorded in the register; proportion of reports submitted by the deadline to the DHMO as indicated by a date stamp
Endline definition: Proportion of reports entered into DHIS2 by the deadline (15th of each month)
The dimensions of data quality were assessed using different definitions at baseline and endline because of the in-person versus remote nature of the two assessments, and because the baseline DQA focused on paper systems while the endline DQA focused on the online DHIS2 system. The measures remain broadly comparable provided it is kept in mind that the baseline measures describe the paper system and the endline measures describe the digital system. On this basis it was possible to assess how DHIS2 at endline performed compared with the former paper record-keeping system.
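To make the endline definitions concrete, the sketch below shows one way the four facility-level proportions could be computed from facility-month records. It is illustrative only: the record structure and field names are assumptions made for this example and are not part of the DQA protocol.

from dataclasses import dataclass
from datetime import date
from typing import List, Optional

# Hypothetical facility-month record for one indicator. All field names are
# illustrative assumptions for this sketch, not part of the DQA protocol.
@dataclass
class FacilityMonth:
    facility: str
    empty_fields: int            # empty fields on the DHIS2 Data Set Report page
    dhis2_total: Optional[int]   # value recorded in DHIS2 (None if missing)
    paper_total: Optional[int]   # value on the photographed paper form
    entered_on: Optional[date]   # date the report was entered into DHIS2
    deadline: date               # reporting deadline (15th of the following month)

def endline_dimensions(records: List[FacilityMonth]) -> dict:
    """Facility-level proportions corresponding to the endline DQA definitions."""
    facilities = sorted({r.facility for r in records})
    by_facility = {f: [r for r in records if r.facility == f] for f in facilities}

    def proportion(passes):
        # share of facilities for which every sampled month meets the criterion
        ok = sum(1 for rows in by_facility.values() if all(passes(r) for r in rows))
        return ok / len(facilities)

    return {
        # every viewed Data Set Report page has zero empty fields
        "completeness": proportion(lambda r: r.empty_fields == 0),
        # DHIS2 totals equal the paper register/report totals
        "accuracy": proportion(lambda r: r.dhis2_total is not None
                               and r.dhis2_total == r.paper_total),
        # data present in DHIS2 for the full three-month period
        "availability": proportion(lambda r: r.dhis2_total is not None),
        # report entered into DHIS2 on or before the deadline
        "timeliness": proportion(lambda r: r.entered_on is not None
                                 and r.entered_on <= r.deadline),
    }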
DQA challenges and limitations
A challenge of conducting the DQA remotely was that communication via email was less efficient than direct in-person communication. In addition, some photographs of the paper forms were blurry, cut off, or otherwise unclear and could therefore not be read. This was a particular problem for the ANC registers, potentially because their large number of pages made them tedious to photograph carefully in a reasonable amount of time. However, all photographed ANC monthly report forms were readable; these forms have fewer pages and were consistently photographed clearly.
Key DQA findings
Due to the absence of data for Proxy Indicators 2 and 3 on DHIS2, a baseline-endline comparison of data quality could only be conducted for Indicator 1. The following account presents key findings for Indicator 1 (number of pregnant women newly testing HIV positive). More information explaining the absence of Indicators 2 and 3 is available upon request.
Indicator 1: Number of Pregnant Women newly testing HIV Positive
The DHIS2 data repository sources data on Indicator 1 from two parallel reporting systems: the paper ANC Monthly Facility Report and the HMIS 15 system. DHIS2 dashboards reflect each data source.
DQA dimension: completeness
Our DQA assessment found that facility-level completeness (proportion of facilities with all months complete) overall was 12/15 (80%) for the ANC report form, and 7/15 (47%) for the HMIS 15 form. The ANC report appeared to be the more reliably complete form on DHIS2. The chart below shows differences in completeness of DHIS2 forms by district for Quarter 1, 2021 for the three time series analysis districts.
Facility-level completeness assessment by form and district, Indicator 1, Q1 2021
DQA dimension: accuracy
The results for the number of pregnant women newly testing HIV positive were generally accurate when comparing the paper ANC monthly report and DHIS2 numbers, and inaccurate when comparing the ANC and HMIS 15 numbers on DHIS2. Data appeared to be transferred from the paper ANC report forms to the online system reliably for the most part, with values equal in the paper and online versions for 11 of the 13 facilities with available forms (85% accuracy). The chart below shows facility-level accuracy data for the ANC report in paper and online versions for Quarter 1, 2021. When comparing the two different sources of this data element in DHIS2 – ANC and HMIS 15 – the results were very inaccurate, with only 1 of the 15 facilities having equal values for both forms (7% accuracy).
Facility-level accuracy assessment, Indicator 1, Q1 2021
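The kind of cross-source check described above can be expressed very simply. The sketch below is illustrative only; the function name and the data values are hypothetical. It lists facilities whose Indicator 1 totals differ between the two DHIS2 sources, or where one source has no value recorded.

# Illustrative only: flags facilities where the two DHIS2 sources for Indicator 1
# (ANC monthly report vs HMIS 15) disagree. Facility names and values are hypothetical.
def flag_discrepancies(anc_totals, hmis15_totals):
    """Return facilities whose quarterly totals differ between the two forms.

    anc_totals / hmis15_totals: dicts mapping facility name -> reported total
    (None where no value is recorded in DHIS2).
    """
    flagged = []
    for facility in sorted(set(anc_totals) | set(hmis15_totals)):
        anc = anc_totals.get(facility)
        hmis = hmis15_totals.get(facility)
        if anc is None or hmis is None or anc != hmis:
            flagged.append((facility, anc, hmis))
    return flagged

# Example with made-up values for two facilities.
print(flag_discrepancies({"Facility A": 12, "Facility B": 7},
                         {"Facility A": 12, "Facility B": 5}))
# -> [('Facility B', 7, 5)]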
DQA dimension: availability
On the DHIS2 system, 10 out of the 15 facilities had HMIS 15 records available for all 3 months, and 5 facilities were missing at least one month (67% facility level availability). For the ANC reports in the DHIS2 system, all facilities had records for all 3 months (100% facility level availability). As shown in the chart below, the ANC report appeared to be the more reliably available form on DHIS2.
Facility-level availability assessment by form and district, Indicator 1, Q1 2021
DQA dimension: timeliness
Facility level timeliness (proportion of facilities to enter all reports by the deadline each month) overall was 13/15 (87%) for the ANC report form, and 8/15 (53%) for the HMIS 15 form. As shown in the chart below, the ANC report form appeared to be entered on time into DHIS2 more reliably than the HMIS 15 form.
Facility-level timeliness assessment, Indicator 1, Q1 2021
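As a simple transparency check, the facility-level fractions reported above for Indicator 1 reduce to the quoted percentages as follows (trivial arithmetic, shown only to make the rounding explicit):

# Facility-level fractions for Indicator 1, Q1 2021, as reported in the findings above.
reported = {
    "Completeness, ANC report": (12, 15),
    "Completeness, HMIS 15": (7, 15),
    "Accuracy, ANC paper vs DHIS2": (11, 13),
    "Accuracy, ANC vs HMIS 15 on DHIS2": (1, 15),
    "Availability, ANC report": (15, 15),
    "Availability, HMIS 15": (10, 15),
    "Timeliness, ANC report": (13, 15),
    "Timeliness, HMIS 15": (8, 15),
}
for dimension, (numerator, denominator) in reported.items():
    print(f"{dimension}: {numerator}/{denominator} = {numerator / denominator:.0%}")
# e.g. "Completeness, ANC report: 12/15 = 80%"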
Comparison against the Baseline DQA
To compare the four dimensions of data quality in the baseline and endline DQAs, data from the [more reliable] ANC report form was used for the endline DQA. The chart below shows the baseline and endline comparisons for Indicator 1 against the four data quality dimensions. Keeping in mind the different definitions, data quality for Indicator 1 appears to have improved against some dimensions in each district from the baseline (paper-based) to the endline (DHIS2) DQA.
Data quality dimensions at facility-level: Baseline and Endline comparison for Indicator 1
HMIS/DHIS2 User Metrics Trends since 2017
The following user metrics for the HMIS / DHIS2 website were examined from 2017 to 2021:
Table 9: User metric definitions for the endline DQA of the HMIS / DHIS2 system
Monthly total users: the total of unique new and returning users to any page of HMIS/DHIS2 per calendar month (source: Google Analytics data)
Average number of pages viewed per visit: the total number of pages viewed divided by the total number of sessions (source: Google Analytics data)
Bounce rate: the proportion of visits to the site which view only the first page before leaving (source: Google Analytics data)
Average page load time: the time taken for a page to fully load and become useable (source: Google Analytics data)
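As an illustration of how these metrics are derived from raw usage data, the sketch below computes the four values for one calendar month. The session-log structure and field names are assumptions made for this example and do not reflect the actual Google Analytics export format.

from dataclasses import dataclass, field
from typing import List

# Hypothetical session-log entry; fields are assumptions for illustration only.
@dataclass
class Session:
    user_id: str
    month: str                  # e.g. "2021-03"
    pageviews: int              # pages viewed in this session
    load_times_s: List[float] = field(default_factory=list)  # page load times (seconds)

def monthly_user_metrics(sessions, month):
    """Compute the four user metrics defined in Table 9 for one calendar month."""
    rows = [s for s in sessions if s.month == month]
    n_sessions = len(rows)
    load_times = [t for s in rows for t in s.load_times_s]
    return {
        "monthly_total_users": len({s.user_id for s in rows}),
        "avg_pages_per_visit": (sum(s.pageviews for s in rows) / n_sessions) if n_sessions else 0.0,
        "bounce_rate": (sum(1 for s in rows if s.pageviews == 1) / n_sessions) if n_sessions else 0.0,
        "avg_page_load_time_s": (sum(load_times) / len(load_times)) if load_times else 0.0,
    }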
Key findings on user metric trends since 2017 have been included in the main body of this report.
DHIS2 Dashboard User Assessment
The endline DQA included a 'heuristic assessment' of the usability of the DHIS2 system by a competent first-time user. Table 10 below summarizes the findings of this assessment. A similar 'lighter touch' assessment was completed for the OHSP, which was not fully populated with data at the time of the assessment. In general, the OHSP was found to be accessible and user-friendly, although there were some minor data consistency and labelling issues (the full assessment is available from Mott MacDonald upon request).
Table 10: User assessment of the DHIS2 platform at endline (October 2021)
Website: DHIS2
Website URL: https://dhis2.health.gov.mw/dhis-web-dashboard/
Heuristic being assessed
Visibility of system status
The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.
Match between system and the real world
The system should speak the users' language, with words, phrases and concepts familiar to the user, rather than system oriented terms. Follow real-world conventions, making information appear in a natural and logical order.
User control and freedom
Users often choose system functions by mistake and will need a clearly marked "emergency exit" to leave the unwanted state without having to go through an extended dialogue. Support undo and redo.
Relevant task or task step
Searching for relevant information.
Usability description
When figures are loading, it displays a moving circle to indicate loading status.
It is not obvious when the user has been logged off. If the session has expired, content simply stops loading rather than a logged-off message being displayed, and the user must refresh the page to log in again.
Evaluator’s comments on usability
The loading icon is universally recognisable (the moving circle).
It would be helpful to automatically display a message when the user has been logged out, asking them to sign back in.
Is the language used on the website familiar to its intended audience?
It is not clear what some of the metrics mean. There is no explanation for the metrics visible on the Dashboards or in the Indicator dictionary.
The acronyms do not have explanations.
It would be very helpful to have descriptions of the metrics and acronyms.
Consistency and standards
Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions.
Navigation and control
Changing the filters on dashboards is fairly easy, there is no complicated undo/redo necessary.
Changing between dashboards and apps is easy. Some apps lose the app icon in the upper right corner of the screen but for these it is easy to go back or home.
The only filters which are easy to use are period and organisation unit; for the other filters, it is not obvious what they mean.
Other than not displaying which information is on which dashboard, the dashboards are easy to navigate.
The pivot table app has so many options it would be difficult to navigate unless the user knew exactly what they were looking for, but it seems very useful.
Exploring the web site.
On the dashboard "HIV (By District)", when filtering by a fixed period of months or quarters, the figure subtitles still show "Last 6 months" or "Last 4 Quarters" on the graph view (see Figure 18). The table view shows the correct period.
The dashboard “03.2 – HIV (By District)” shows national data and “03.3 – HIV (National)” shows data by district (they appear to be switched). Same with the Maternal & Child Health district and national dashboards. See Figure 19 below.
There are two data visualizer apps – 'Classic Data Visualizer' and 'Data Visualizer'.
On dashboards, filter information is not displayed in the figures, and sometimes the subtitles contradict the filters, so it is not always obvious if the data shown is for the period selected. It is easier on the table view than the figure view. It should ideally be straightforward both ways.
There appears to be a moderate level of consistency in the way data is presented between different dashboards (comparing HIV site, district, and national, and Maternal & Child Health site, district, and national).
It seems unnecessary to have 2 data visualizer apps. Perhaps it is for personal preference for more experienced users, but if most users prefer one, the other is redundant.
Error recognition, diagnosis, and recovery from errors
Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.
Recognition rather than recall
Make objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.
Simple and advanced search: can you get the same results?
There are 2 apps for indicator definitions – ‘Indicator Dictionary’ and ‘Indicator Search’. It seems unnecessary to have 2 indicator definition apps, particularly if neither is up to date with definitions for all indicators.
The only common error message on the Dashboards is “No data”.
On the Maps app, for any ‘Earth Engine’ layers, (precipitation, temperature, population density, etc), an error at the bottom of the screen reads “No value present: To show this layer you must first sign up for the Earth Engine service at Google. Please check the DHIS 2 documentation.”
Advanced search: instructions for use of the system are not obvious.
The ‘Indicator dictionary’ app should contain definitions of all indicators, but many of them are not descriptive enough or do not specify the denominator.
The “no data” error message could be more precise. For example, the ART registration data is available by quarters but not by months, which is not obvious without trial-and-error. If you filter by months, it says “no data”, and if you filter by quarters, data is displayed. Perhaps a regular user would know this information already.
It is taking me a lot of trial-and-error to find the information I need, since I cannot find descriptions for some of the data. For example, the time frame for “cumulative” is not specified – is it cumulative since the beginning of the year? Since the beginning of reporting on the system? The last 12 months? Without that information it is difficult to know the difference between, for example, Cumulative ART Registrations and New ART Registrations for the same time period. I checked the Indicator Dictionary app and it did not provide a useful answer.
Flexibility and efficiency of
use
Accelerators unseen by the novice user – may often speed up the interaction for the expert user such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions.
Aesthetic and minimalist design
Dialogues should not contain information which is irrelevant or rarely needed.
Help and documentation
Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too large.
Does the website have any shortcuts for proficient users?
It seems that the pivot table app can be used very flexibly to display data in a variety of ways that are most convenient for the user. It contains a lot of additional data not shown on dashboards.
The pivot table app seems very useful for advanced users, even better than the dashboards for finding the desired data. With an instruction tutorial it could be used by novice users too.
General website appeal and initial impressions
Easy to read font, good contrast
Good initial impressions of the website aesthetic. Data is presented in easy-to-read figures on the dashboards.
How do I use the advanced search?
There are no instructions or help buttons visible on Dashboards. The information button says “No description” when clicked.
It is unclear where I can find instructions on website use. It would be helpful to have some basic instructions on how to perform common actions, including use of filters, and information on what each dashboard shows. The Indicator Dictionary and Indicator Search apps are not as useful as they should be.
Annex 4: Endline sampling map and summary of KAP survey findings
The questionnaire used for the facilitated Knowledge, Attitudes and Practices (KAP) Survey at endline, and the full findings from the survey, are available from the evaluators upon request. In this annex, we include a map of the sampled districts and facilities, key findings from the endline KAP survey, and key findings from the time series comparison against the baseline and midline surveys.
Sampling map
The following map shows the location of the sampled health facilities at endline. Note that sampled district hospitals are located in the main district town.
Key findings from the endline survey
Findings on general characteristics of sampled respondents
The following charts capture endline KAP survey findings on: the number of respondents by district; respondents' roles; and the numbers with access to a mobile digital device. We note that nearly all respondents had access to a smartphone. Around 40% of respondents reported they were directly involved in HIV patient care.
Endline findings on entering data
The following chart shows that the majority of respondents had used the health information system (HIS) to enter data in the last month, with the majority entering data on patient details, diagnostics and aggregated information; 30% had entered data relating to COVID-19.
We found that 57% of respondents had entered data using an EMR, while 88% had entered data using paper-based methods. 68% reported that they had wanted to enter data into the HIS in the last month but were unable to. As shown in the chart below, the most common reasons related to lack of internet connectivity and the electronic system not working.
Just over 60% of respondents said they had needed help entering data into the HIS in the last month, with 96% of these respondents indicating that they had asked for help and received it. In 79% of these cases, the help was provided by a workplace colleague.
As shown in the chart below, 128 respondents reported that they had received training on how to enter data into the HIS (paper and electronic formats) in the past six months. A further 79 respondents reported they had received mentoring and supervision on data entry in the past six months.
Endline findings on looking up and using data
As shown in the chart below, when respondents were asked about health information they had looked up or used in the past six months, 62% said they had consulted paper-based facility records, while 40% said they had consulted the EMR system. Of those who had not looked up or used health information in the last six months, the most common reason given was that they "did not need the information".
As shown in the chart below, when asked how frequently they had looked up or used health information in the last month, the largest group (36%) said they had done so once or twice, although 24% did so more than once a day. Overall, this appears to represent a decline in frequency since midline (see next section).
In most cases, respondents who sought health information said they were looking for or using information on patients' personal details, treatment, laboratory tests or vital signs. As shown in the chart below, the reasons most often related to patient follow-up, outcomes, comparison with the past, and stock management.
However, the chart below shows that, when asked to rank the most important data source for decision-making, most respondents identified facility registers as the preferred source.
Comparisons against the baseline and midline
The following charts show key findings that lend themselves to comparison across the baseline, midline and endline surveys.
The following three charts show that there have been improvements in user perceptions of the quality of HIS data (against the DQA criteria of accuracy, timeliness and completeness) since baseline.
Do you think the health and health related information you currently use to help you make decisions in your job is accurate?
Is the health and health related information provided regularly and in good time, to help you make decisions in your job?
Do you think the health and health related information you use to make decisions in your job is complete?
There was an increase in the percentage of users entering data using EMRs at endline. Notably, other data pointed to a significant decline by endline in users reporting that data entry was not their responsibility or job. The percentage reporting they were unable to enter data in the last month when they wanted to was stable from midline to endline, at around 32%.
Which registers, forms, reports, databases or systems have you used in the last month TO ENTER this information?
As shown in the chart below, there was also a significant decline in the number of users reporting they did not have the skills to enter data (by paper-based or electronic methods) by endline.
Why could you not enter health or health related information in to the register, form, report, system or database [enumerator, take your cue from the respondent and just say the relevant one] you wanted to?
The reasons given for looking up / using HIS data have remained consistent since baseline (mostly reviewing patient details, diagnostics, treatment and stock management). However, as shown in the chart below, the frequency of looking up and using HIS data has shown an overall decline over time.
Thinking about THE LAST MONTH, how frequently on average did you look up or use information from these information sources for your work?
Annex 5: Interview methodology and key findings
This annex provides an overview of the methodology applied for the endline KIIs and FGDs. It also presents the qualitative findings by core evaluation question. The specific interview schedules / topic guides used for interviews, along with disaggregated matrices of qualitative findings are available from Mott MacDonald upon request. More detailed qualitative findings for Part 1 of the synthesis report and the special studies are also available upon request.
The interview methodology
Qualitative data collection was conducted between late August and early December 2021.
The qualitative findings presented here are based on triangulated data collection from literature review, key informant interviews (KII) and focus group discussions (FGD), linked to contribution analysis and examination of the Kuunika evaluation Theory of Change and hypotheses. Therefore, the results presented below for the core evaluation questions 1 - 4 are grounded in information gathered from key project, health sector and partner stakeholders.
Table 11: Overview of interview sampling
The matrices for the three broad categories of key informant interviews and focus group discussions (health facility staff, District Health Office and District Council staff and national/partner) are available as raw data from Mott MacDonald – these matrices represent the basis for analysis.
Limitations of the qualitative methodology: These included restrictions on travel due to the pandemic and the resulting challenges of conducting remote interviews through Microsoft Teams and Zoom. Connectivity problems led to several missed KIIs. Travel restrictions also meant it was not possible to undertake dedicated KIIs or FGDs with Health Surveillance Assistants (HSAs) / Disease Control Surveillance Assistants (DCSAs), or with any representatives of Health Facility, Village or other categories of health committee – note that this had been possible during the baseline and midline evaluations.
Summary of key qualitative findings by evaluation question
The following summary includes indicative quotes and examples only. These have been selected to illustrate the evaluators' process of contribution analysis and testing of the project theory of change and hypotheses (Annex 1). The full qualitative matrices available from Mott MacDonald contain more comprehensive quotes and examples. The findings below have been organized by evaluation question.
Q1. Has data quality improved?
Key findings were as follows:
● Insufficient training and supervision: A number of Data Clerks stated that they had not received any initial or refresher training.
● Hierarchies of responsibility and collection: Several respondents at facility level indicated that there is a hierarchy involved in data entry - this is not seen as a high status job. It was clear that some health workers continue to see themselves as outside the 'data chain', in terms not only of collection, but analysis and application.
● The role of civil society in collecting and using data: As at baseline and midline, key informants acknowledged the critical role played by community organizations and Health Surveillance Assistants (now said to have been renamed Disease Control Surveillance Assistants, although HSA respondents identified themselves as such) in collecting data and supporting the management of patient access. At national level there is reference to the long awaited and still delayed LAHARS (Local Authority HIV and AIDS Reporting System), which is considered to provide opportunities for more user-friendly data collection.
● Data 'super users': As at midline, we identified some 'data super users'. At endline these included a programme co-ordinator, a nurse and a DHO member. As illustrated by the quote below, a common emotion expressed by the super users was frustration: over lack of time, poor data quality and support that falls short of the demands placed on them, coupled with a desire to be able to use data more effectively to plan and to deliver services.
“It is really frustrating not to able to use data as they could and should be: we know what could be done and we see obstacles still. I would really like to use DHIS2 to plan and deliver services and support others, here and at e.g. review meetings. The demands are placed on us, while the support is inadequate. Projects come, projects go, all with slightly different priorities - and they seldom talk enough to each other and certainly not to us or the DHO.”
(Health facility staff member, Kuunika District)
● Use of EMRs and Paper Registers: At district and facility levels, respondents consistently reported that paper registers continue to be an essential part of data collection. Electronic Medical Records (whatever the view of their usefulness, completeness, etc.) were widely seen as too unreliable, due to power outages and lack of connectivity, for any health facility to contemplate abandoning the use of paper registers. Respondents were, however, able to identify the positives and negatives of both systems.
● Interaction with DHIS2: Respondents at facility level consistently reported that access to DHIS2 continues to be limited. Key barriers mentioned were the need for password access, the volume of data available and challenges in navigating the system. National level key informants reported that a key issue remains how to best bring together the paper and EMR systems, and combine the data in ways that are user-friendly and accessible.
“There's also the sheer volume of data - DHIS2 can overwhelm users. There isn't always a correlation between quantity and quality of data.” (National-level respondent)
● Perceptions of data quality at subnational levels:
– Data completeness: Health facility respondents consistently referred to gaps in routine data collection systems (e.g. on ART information) and the need to retrospectively collect and reconcile the data.
“At ANC, entering data on [the EMR] - we abandoned it, because it was incomplete and poor. When women went for HIV testing, this didn't show in the [EMR] system, so we went back to the paper-based module.” (Health facility staff member, Kuunika District)
– Data accuracy: Several respondents referred to issues of human error and the fact that errors can be made in data entry, not least because many Data Clerks have not been appropriately trained, especially in the use of EMRs. Key challenges mentioned included: few opportunities for quality assurance, the volume and pressure of work at facility level, and the proliferation of different reporting systems.
– Data timeliness: Several respondents at facility and district level suggested that timeliness of reporting had deteriorated in the past two years. One reason given was the increased demand for COVID-19 data. Other frequent comments were that both paper and electronic reports need to be submitted to the district office; also that additional time is needed to backfill data.
● Whether trust in data increased as a result of Kuunika support: District-level respondents in Zomba mentioned support provided by the project that was considered to have increased trust in data. There were also perceptions of a wider range of GoM and donor partners providing support and a degree of momentum to enhancement of trustworthy data.
“I feel that data reliability has been increased, in part due to Kuunika supporting the setting up of the WhatsApp group, where we can triangulate, discuss, plot, track those who haven't reported - and take action. Such responsibility feels good.” (DHO staff members, Kuunika District)
● Progress on SOPs: One national level key informant made the following observation:
“We came out with SOPs to drive the effort [to support use of digital data], but these are nowhere to be seen.” (National-level respondent)
Q2. Has data use increased?
Key findings were as follows:
● Using data at subnational levels: While some district-level officers reported some progress in data use, the general perception at facility level was that high workloads make consistent data use practices difficult.
“We have seen changes in service delivery - previously some were not able to enter details properly, now we can do that, so can review and make better use. There is faster data entry now, so that in itself improve quality of service delivery.” (Health facility staff member, Kuunika District)
“There are so many indicators and so much raw data; it is difficult for the Data Clerks to enter the mass of data and for us to look it up, make sense of it and use it. There are also so many Registers and so many demands from partners and the DHO and the DHA. So even if you're committed like me, it is hard to keep on top of the data and make best use.” (Health facility staff member, Kuunika District)
● Support for data use: Respondents at sub-national level suggested support for data use often largely hinges on intra and inter-facility meetings. While the pandemic has resulted in what appears to be the widespread halting of quarterly district M&E review meetings, a small number of health facilities (2 of the 6 visited for the FGDs) mentioned that they hold their own, internal data review meetings; one facility uses the meetings to develop monthly action plans. WhatsApp groups continue to function and are widely seen as positive, immediate and task-oriented, with individuals and health facilities able to confer at their site or more widely, to learn from each other and to solve problems within their scope.
Q3. Have attitudes and practices around decision-making improved?
Key findings were as follows:
● Development of a data use culture: One positive finding from a minority of both health facility and DHO staff member respondents is that there continues to be some development of a 'data culture', which indicates some movement with regard to improved attitudes and practices for evidence-based decision-making. The District Health Office was sometimes seen as facilitating a data use culture:
“We are guided by the DHO and also (increasingly) by the data we share and view on WhatsApp, DHIS2 + general discussions with other health workers. These have helped our health facility to be more self-sufficient in terms of planning to use HIV data - but resources always lag very far behind.” (Health facility staff members, Kuunika District)
● Dependence on data clerks: At facility level there were many references to data clerks having primary responsibility for data entry and use. For example:
“Is DHIS2 used for that? I've never used it; I rely on the Data Clerk - surely they are the ones to plan the way forward?” (Health facility staff member, Kuunika District)
“The data clerks use [DHIS2], but only one has been trained at this facility. The head of HMIS uses it most“ (Health facility staff members, Kuunika District)
● District acknowledgement of Kuunika support: Some district key informants recognised the gains from Kuunika support:
“The Department for Digital Health and Kuunika have worked very closely together. In our District the Kuunika people have been supportive in terms of DHIS2, which has helped us at the DHO with planning. Kuunika has tried to live up to its name, to support decision makers to understand data and make use of them.” (DHO staff member, Kuunika District)
● National-level perspectives on Kuunika support to the development of an evidence-based knowledge culture varied. Some key informants suggested that while more data is now available, it is not translated into decision-making, in part due to “lack of a performance management culture”. There were also concerns about: “too many partners, too many reporting formats and too many demands for data”.
Q4. Have key [HIV] service areas improved as a result?
Key findings were as follows:
● Kuunika inputs to improved service delivery: There were some perceptions that Kuunika has contributed to strengthening digital data systems across the board. A few respondents felt that such improvements are now embedded into the system.
“Those HMIS review meetings, where we look at indicators - those are useful and I think we now understand their value. Kuunika has supported us to shift towards greater use of data for planning, which can only assist service delivery improvements.” (DHO staff member, Kuunika District)
● Effects of the pandemic: A few respondents pointed out that the focus on COVID-19 data collection had reduced attention to data collection for other services. There were also concerns about recent reductions in mentoring and supportive supervision:
“I have missed mentoring during the pandemic; it encourages me to think about quality and use of data.” (Health facility staff member, non-Kuunika District)
● Sustainability of progress: Views were mixed. Zomba District respondents were the most positive, stating that while even before the pandemic the project was proactive and problem solving at both district and health facility levels, the COVID-19 response had introduced a 'push in data management, reporting and use'. Because data collectors, analysts and users have all been aware of the paramount need for timeliness, this was said to have strengthened the whole process of data use. This attitudinal and practice shift was felt to be sustainable - with continued support.
Other views were that throughout the project, the emphasis has been on data collection and data extraction. There were concerns (especially at district level) that this did not promote sustainability in terms of data quality, ownership, and practical use of data for planning and service delivery purposes. The various meetings funded by Kuunika were beginning to expand engagement and ownership; however, the view was that these need to be re-started and sustained for the foreseeable future.
Annex 6: Lessons for evaluation methodologies
This annex reflects on the original impact evaluation design and its fitness for purpose in (i) capturing the changing activities and aims of Kuunika over its life and (ii) distilling the lessons learnt.
● The mixed-methods approach (quantitative and qualitative) remains valid. The mixed-method evaluation approach combines simple and intuitive quantification of progress, triangulating and explaining it with deeper qualitative assessment. The complexity and multi-faceted nature of digital innovation in low-resource settings requires a depth and variety of measurement which cannot be satisfactorily achieved in any other way.
● A theory-based and contribution analysis approach to the evaluation design and implementation remains valid too. This approach can accommodate project complexity and adaptation, but provides a transparent framework for selecting the mixture of methods, weighing and synthesising results. It also enables and invites participation and discussion that promote learning amongst stakeholders. However, there is perhaps more scope for scheduling points of joint reflection, progress updates and participatory learning (such as annual reviews).
● Building in emerging best practice: Since Kuunika began, there have been massive strides49 in the experience of digital health innovation – how to strategise it, how to do it and how to evaluate it. A lot has been learnt and shared since 2016/17 on best practice in all these areas.
● Keeping the formative research role distinct: The formative learning process ('act fast, fail fast, learn fast') should be explicitly wrapped into the design of the digital programme from the start, with programme documents and management processes maintained to work with, and capture, that approach.50
● Documentation of learning and course correction: While much of that formative learning is in real time, tacit and inherent in the programme and in the process of ‘learning by doing’, there is benefit in recording and disseminating not just what changes have happened and developments have occurred, but why, and what the learning process has been to get there.
● Incorporating an evaluation utilisation approach: The above step would mean that maintaining and reviewing the key programme documents could then be linked into a light touch, independent evaluation that employs more participative, self-assessment and scorecard approaches to assist and embed learning in the programme.
● Retaining some independence in the evaluation function is recommended, to provide not only verification and independent advice to the client, and the wider digital health community, but also help difficult issues to be surfaced and potentially difficult messages delivered.
● Jointly developing some key programme design tools. For example, the project theory of change and M&E framework (which could be linked practically to a 'digital roadmap' for digital transformation). The particular division of labour and usable nature of these tools would need to be carefully agreed and negotiated. Consideration would need to be given to joint tools for: a) the technical back-end of the system architecture; b) organisational functions, including understanding digital confidence and innovation readiness – all contingent determinants of delivery; c) the wider contextual governance and legislative determinants and needs, including stakeholder and political economy assessments, and behavioural change and innovation adoption scoring.
● Application of standard global concepts, indicators and tools in the evaluation that have growing currency in the field of digital transformation. Global concepts such as 'innovation readiness', 'digital confidence' and 'innovation adoption' are now commonly defined, and measurement methods and composite indices have been developed which could be adapted for low-resource settings such as Malawi. It might also be possible to draw on in-country frameworks and methods for evaluating digital innovation (including in the health sector). Application of 'non-adoption, abandonment, scale-up, spread and sustainability' frameworks could add value to sustainability evaluations.51 Use of more standardised concepts and evaluation frameworks across digital health projects could assist comparison work and promote country-level peer-to-peer benchmarking and review.
49 For example, the inaugural edition of the State of Digital Health report, presenting the Global Digital Health Index, was published in April 2019.
50 The original project design for Kuunika planned for a two-part approach to learning and evaluation, reflecting the expected need for support to detailed formative research, rapid learning and feedback to the implementing partners, plus an independent evaluator to take an overview of the programme and the extent to which it had progressed towards its overall goal and impact. Later, Cooper / Smith took the formative research role into its implementation role, rightly making closer links between rapid learning and implementation.