Mixed method data collection
As far as possible, the endline data collection methods replicated those used at baseline, with some adjustments to accommodate COVID-19 travel restrictions. The methods used at endline are summarized below.29
● A knowledge, attitudes and practice (KAP) survey explored perceptions of health data quality, individuals’ use of data at work, and organizational drivers of data use in decision-making. The survey questionnaire was the same as in 2017 and 2019,30 with three additional questions referring to COVID-19 monitoring data. The survey was conducted face-to-face by the local survey company. The enumerators captured 223 full, unique KAP survey responses.
● Stakeholder interviews included key informant interviews (KIIs) and, where feasible and appropriate, focus group discussions (FGDs). The KIIs were largely organized and conducted remotely from the U.K. using Zoom or Teams video-calling software.31 In total, by mid-October, we had completed 15 KIIs and 6 FGDs involving 25 participants.32
● A Data Quality Audit (DQA) and User Assessment was used to objectively assess whether data quality in DHIS2 had improved over the life of the project. These assessments were informed by a key hypothesis in the original theory of change: that use of digital data depends on users’ trust in the quality of the data. As in 2017, four dimensions of data quality (completeness, accuracy, availability, and timeliness) were assessed for three proxy indicators, purposively selected as important to HIV service delivery and planning. For this exercise, the sample period was the three-month period from 1 January to 31 March 2021 (first quarter 2021). Data elements were collected for the following three indicators:
– Indicator 1: Number of pregnant women newly testing HIV positive
– Indicator 2: Number of HIV positive clients initiated on ART for the first time
– Indicator 3: Number of nurses posted at the facility
Notably, the DQA was the most heavily modified data collection activity since the 2017 baseline. This was partly due to pandemic travel restrictions,33 but also allowed us to examine the effects of digital innovation (in line with Kuunika objectives). All relevant paper forms for the data elements at each facility were photographed by the enumeration team at the same time as the KAP surveys were administered, and uploaded to a shared online folder where they could be viewed remotely.34 These were then compared with what could be found in the HMIS/DHIS2 system, which was fully accessible online.
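To make the scoring concrete, the sketch below illustrates one way the paper-form values could be compared against DHIS2 entries to produce simple completeness, accuracy and timeliness rates for an indicator. It is not the evaluation team’s actual tooling; the record structure, facility names and values are invented for illustration, and a formal DQA would typically use a verification ratio rather than exact agreement.

```python
# Minimal sketch (assumed structure, not the evaluation's actual method) of
# scoring the paper-form vs DHIS2 comparison for one indicator in Q1 2021.
from dataclasses import dataclass
from typing import Optional


@dataclass
class DqaRecord:
    facility: str
    indicator: str               # e.g. "Pregnant women newly testing HIV positive"
    paper_value: Optional[int]   # value read from the photographed register
    dhis2_value: Optional[int]   # value found in HMIS/DHIS2 for the same period
    reported_on_time: bool       # submitted within the reporting deadline


def summarise(records: list[DqaRecord]) -> dict[str, float]:
    """Return simple completeness, accuracy and timeliness rates."""
    total = len(records)
    # Completeness: both the paper form and the DHIS2 entry exist.
    complete = [r for r in records if r.paper_value is not None and r.dhis2_value is not None]
    # Accuracy here is exact agreement between paper and DHIS2 values;
    # a real DQA might instead report a verification ratio per indicator.
    accurate = [r for r in complete if r.paper_value == r.dhis2_value]
    timely = [r for r in records if r.reported_on_time]
    return {
        "completeness": len(complete) / total,
        "accuracy": len(accurate) / len(complete) if complete else float("nan"),
        "timeliness": len(timely) / total,
    }


# Illustrative records for two facilities (all figures invented).
sample = [
    DqaRecord("Facility A", "Indicator 1", paper_value=12, dhis2_value=12, reported_on_time=True),
    DqaRecord("Facility B", "Indicator 1", paper_value=9, dhis2_value=7, reported_on_time=False),
]
print(summarise(sample))
```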
A new component introduced to the DQA in 2021 was an assessment of the DHIS2 user interface, examining how the look and feel of the system had developed from the perspective of a ‘semi-expert’ user.35 For this exercise, we applied the user assessment to both the HMIS and the One Health Surveillance Platform (OHSP), with the latter included to reflect Kuunika’s pivot to COVID-19 monitoring since early 2020.
In addition, we used the Google Analytics system-use metrics for DHIS2, supplied by Kuunika since 2017, to examine trends in use over the last five years.
29 Since the data collection instruments are sizeable documents they have not been included in this report; however, they are available from Mott MacDonald upon request.
30 For the 2019 and 2021 rounds we used the Kobotoolbox.com software.
31 Special Study 1 was authored by Dr Mwakilama, who is based in Lilongwe and was able to conduct interviews face to face or by telephone.
32 In addition, over 40 KIIs were conducted for the special studies – these have informed our synthesis analysis.
33 The facility-level qualitative systems assessment interviews which were a feature of the 2017 DQA were omitted because COVID-19 restrictions prevented travel by the international evaluation team.
34 See Table 4 for some limitations of this approach.
35 The assessment was conducted by a Mott MacDonald epidemiologist, who is expert in looking for and understanding health data but a novice in Malawi’s HMIS and the DHIS2 system.