
2.2 Emergence of public health surveillance systems

Digital identification systems

The world’s two largest digital identification systems are ‘Huduma Namba’ in Kenya and ‘Aadhaar’ in India. They involve the collection of biometric and other identifiers, including fingerprints, retina and iris patterns, and voice patterns. They determine access to essential government services (such as voting, registering birth certificates and civil marriages, or paying taxes) and to pensions and unemployment benefits. However, there is evidence that, ‘when trying to access public services through these systems, certain racial and ethnic minority groups in both countries find that they are excluded, while others face logistical barriers (...) that in effect can result in de facto exclusion from services to which they are entitled’55. Furthermore, people with disabilities have ‘experienced discrimination for not being able to provide fingerprint or iris scans’56.


Apart from the expansion of existing surveillance systems, the COVID-19 crisis has also led to the unveiling of many high-tech tools specifically aimed at tackling the pandemic. The most prominent example is the rapid rollout of pandemic-related mobile applications used for contact tracing, quarantine enforcement, social distancing monitoring, or symptom tracking, sometimes combined with a health status code. Such smartphone apps have been introduced in at least 54 countries across the globe57. These technologies may offer benefits to policymakers, the medical community, and society at large (for example, by supporting efforts to protect public health and manage the crisis), but their widespread application also carries significant implications for fundamental rights.

In many instances, these tools have been developed with minimal protection against abuse (such as excessive use by law enforcement agencies for non-pandemic-related purposes), without sufficient evidence of their efficacy in protecting public health, and without appropriate scrutiny of whether they are proportionate to counter-epidemic efforts58. Even though mobile location data may reveal sensitive information about people’s identity, location, behaviour, associations, and activities, many developers have largely ignored the principles of privacy-by-design, which would ensure that privacy considerations are built into a tool’s architecture and software. Apps are often closed source and centralised (sending unencrypted data to central government servers), and have insufficient cyber-security standards, allowing data to be shared with multiple institutions. This is particularly problematic under repressive regimes, where human rights defenders, independent journalists, and opposition leaders are routinely targeted: apps that generate sensitive health and social networking data about these individuals increase the opportunities for abusive surveillance.

Moreover, in some countries (Singapore, Ukraine, and Bahrain, for example) apps have been made mandatory, with a disproportionate and discriminatory impact on certain populations, particularly when non-digital alternatives are not provided (see Table 1). In other countries, such as China, India, and Turkey, COVID-related health status and contact-tracing mobile apps, even though not officially mandatory, have been made gatekeepers for access to essential public services and spaces, such as public transport, workplaces, or shopping malls59.

55 UN Special Rapporteur on contemporary forms of racism, racial discrimination, xenophobia and related intolerance, op. cit., par. 40; UN Special Rapporteur on extreme poverty and human rights, op. cit., par. 15.
56 UN Special Rapporteur on contemporary forms of racism, racial discrimination, xenophobia and related intolerance, ibidem.
57 Freedom House, ‘Freedom of the Net 2020 Report’, 2020, op. cit., p. 15.
58 Joint civil society statement, ‘States use of digital surveillance technologies to fight pandemic must respect human rights’, 2020.
59 Freedom House, ‘Freedom of the Net 2020 Report’, 2020, op. cit., p. 15.
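The contrast drawn above between centralised apps that upload unencrypted data to government servers and tools built with privacy-by-design can be made concrete with a short sketch. The following Python fragment is purely illustrative and does not reproduce the implementation of any app discussed in this section: it assumes a decentralised design, loosely inspired by protocols such as DP-3T and the Apple/Google Exposure Notification framework, in which devices broadcast short-lived random identifiers and all exposure matching happens on the user’s own phone. The names, key sizes, and rotation interval below are assumptions chosen for readability.

```python
import os
import hashlib
from typing import List, Set

EPOCH_MINUTES = 15                          # assumed identifier rotation interval
EPOCHS_PER_DAY = 24 * 60 // EPOCH_MINUTES   # number of identifiers derived per day


def daily_key() -> bytes:
    """Random per-day key; in a decentralised design it never leaves the device."""
    return os.urandom(32)


def ephemeral_id(day_key: bytes, epoch: int) -> bytes:
    """Short-lived, unlinkable identifier broadcast locally (e.g. over Bluetooth)."""
    return hashlib.sha256(day_key + epoch.to_bytes(4, "big")).digest()[:16]


class LocalContactLog:
    """Identifiers observed from nearby devices, stored only on the user's phone."""

    def __init__(self) -> None:
        self.observed: Set[bytes] = set()

    def record(self, eph_id: bytes) -> None:
        self.observed.add(eph_id)

    def check_exposure(self, published_day_keys: List[bytes]) -> bool:
        """Match local observations against day keys voluntarily published by
        users who tested positive; no raw contact data is ever uploaded."""
        for key in published_day_keys:
            for epoch in range(EPOCHS_PER_DAY):
                if ephemeral_id(key, epoch) in self.observed:
                    return True
        return False


# Minimal usage example: two devices meet, and one user later tests positive.
if __name__ == "__main__":
    alice_key = daily_key()
    bob_log = LocalContactLog()
    bob_log.record(ephemeral_id(alice_key, epoch=3))  # Bob's phone hears Alice's beacon
    print(bob_log.check_exposure([alice_key]))        # True, computed entirely on-device
```

The design choice that matters for the rights concerns discussed above is that only the keys voluntarily published by infected users ever leave a device; a centralised design would instead upload the full contact log to a government server, which is precisely what creates the surveillance risks described in the text.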

Table 1: Examples of human rights implications of mandatory pandemic-related apps

| Country | Type of application | Problem | Explanation |
|---|---|---|---|
| Singapore | Contact-tracing app | Discriminatory impact | The app is obligatory for some categories of migrant workers who already faced discrimination, increasing the risk of further marginalisation of this group60. |
| Ukraine | Quarantine-enforcement app | Discriminatory impact; risk of exposure to life-threatening situations | The government required people crossing its borders to install the app to monitor compliance with self-isolation orders. This has particularly affected elderly people in the Donetsk region, where many are unable to download the app and are therefore denied entry to government-controlled territory, instead left stranded in an active conflict zone61. |
| Bahrain | Quarantine-enforcement app | Excessive punishment | Individuals failing to comply with the obligation to use the mandatory app and wear the electronic wristband which comes with it face severe criminal sanctions (a fine of up to 26,000 USD and/or a minimum three-month jail term)62. |

Data-driven responses to the pandemic are not limited to mobile apps. In different parts of the world, they also include solutions such as digital permit systems for non-essential travel, both on public transport and in private vehicles (Russia63); expansion of state access to data stored by telecommunications companies (in at least 30 countries across the world64); and the aggregation of data from different sources on new public health platforms. The last two measures have been deployed in Ecuador, where the government has introduced laws enabling satellite tracking of people suspected of having COVID-19 to ensure that they are complying with isolation requirements, and has also established a platform aggregating location data, surveillance camera footage, and data from a symptom-checking app65.

Overall, the pandemic has likely led to the emergence of new ways of ‘digital social sorting, in which people are identified and assigned to certain categories based on their perceived health status or risk of catching the virus’66. It has also exposed the problem of public-private partnerships in the area of surveillance, as many governments have developed their pandemic-related technological solutions in collaboration with private companies that build surveillance tools or process user data (such as telecom companies or internet service providers). This illustrates a wider trend of states ‘outsourcing’ surveillance, often with very little transparency and in cooperation with companies ‘hiding’ behind confidentiality and trade secret exceptions67. When those partnerships are implemented without appropriate safeguards and public

60 Privacy International, ‘Singapore contact tracing app made mandatory for migrant workers’, 2020.
61 Human Rights Watch, ‘Ukraine: Trapped in a War Zone for Lacking a Smartphone’, 26 June 2020.
62 The National, ‘Coronavirus: Bahrain to use electronic tags for people in quarantine’, 5 April 2020.
63 Rudnitsky, J. and Khrennikov, I., ‘Moscow Tightens Lockdown With Digital Permits as Virus Spreads’, Bloomberg, 10 April 2020.
64 Ibid., p. 18.
65 Human Rights Watch, ‘Ecuador: Privacy at Risk with COVID-19 Surveillance’, 1 July 2020; Association for Progressive Communications and other NGOs, ‘Ecuador: Surveillance technologies implemented to confront COVID-19 must not endanger human rights’, 19 March 2020.
66 Freedom House, ‘Freedom of the Net 2020 Report’, 2020, op. cit., p. 14.
67 Privacy International, ‘Public-Private surveillance partnerships’, 2020.
