
1 Introduction

- The guidelines also depict several typical use cases and list numerous relevant considerations, especially with regard to the necessity and proportionality test (Annex III).

1. Facial recognition technology (FRT) may be used to automatically recognise individuals based on their faces. FRT is often based on artificial intelligence, such as machine learning technologies. Applications of FRT are increasingly tested and used in various areas, from individuals to business enterprises and public administration. Law enforcement authorities (LEAs) also expect advantages from the use of FRT. It promises solutions to relatively new challenges, such as the investigation of big data, but also to known problems, in particular with regard to under-staffing and to observation and search measures.


2. A great deal of the increased interest in FRT stems from its efficiency and scalability. With these come the disadvantages inherent to the technology and its application, also on a large scale. While thousands of personal data sets may be analysed at the push of a button, even slight algorithmic discrimination or misidentification can severely affect large numbers of individuals in their conduct and daily lives. The sheer scale of the processing of personal data, and in particular of biometric data, that FRT makes possible is a further key element, as the processing of personal data constitutes an interference with the fundamental right to protection of personal data according to Article 8 of the Charter of Fundamental Rights of the European Union (the Charter).

3. The application of FRT by LEAs will, and to some extent already does, have significant implications for individuals and for groups of people, including minorities. These implications will also have considerable effects on the way we live together and on our social and democratic political stability, given the high significance of pluralism and political opposition. The right to protection of personal data is often key as a prerequisite to guarantee other fundamental rights. The application of FRT is also considerably prone to interfere with fundamental rights beyond the right to protection of personal data.

4. The EDPB therefore deems it important to contribute to the ongoing integration of FRT in the area of law enforcement covered by the Law Enforcement Directive1 and to provide the present guidelines. They are intended to provide relevant information to law makers at EU and national level, as well as to LEAs and their officers when implementing and using FRT systems. Individuals with a general interest or affected as data subjects may also find important information, in particular as regards data subjects’ rights.

5. The guidelines consist of the main document and three annexes. The main document at hand presents the technology and the applicable legal framework. To help identify some of the major aspects for classifying the severity of the interference with fundamental rights in a given field of application, a template can be found in Annex I. LEAs that wish to procure and run an FRT system may find practical guidance in Annex II. Depending on the field of application of FRT, different considerations are relevant; a set of hypothetical scenarios and the relevant considerations may be found in Annex III.

1 Directive (EU) 2016/680 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data, and repealing Council Framework Decision 2008/977/JHA.
