
TECHNOLOGY: INFORMATION COMMISSIONER’S OFFICE

Privacy watchdog flags concerns over facial recognition tech in stores

UK PRIVACY WATCHDOG, THE ICO, HAS HIGHLIGHTED ITS CONCERNS OVER THE USE OF FACIAL RECOGNITION TECHNOLOGY IN STORES AND OTHER PUBLIC PLACES. BY ELIZABETH DENHAM, UK INFORMATION COMMISSIONER

Facial recognition technology brings benefits that can make aspects of our lives easier, more efficient and more secure. It allows us to unlock our mobile phones, set up a bank account online, or pass through passport control. But when the technology and its algorithms are used to scan people’s faces in real time and in more public contexts, the risks to people’s privacy increase.

I am deeply concerned about the potential for live facial recognition (LFR) technology to be used inappropriately, excessively or even recklessly. When sensitive personal data is collected on a mass scale without people’s knowledge, choice or control, the impacts could be significant.

We should be able to take our children to a leisure complex, visit a shopping centre or tour a city to see the sights without having our biometric data collected and analysed with every step we take.

Unlike CCTV, LFR and its algorithms can automatically identify who you are and infer sensitive details about you. It can be used to instantly profile you to serve up personalised adverts or match your image against known shoplifters as you do your weekly grocery shop.

In future, there’s the potential to overlay CCTV cameras with LFR, and even to combine it with social media data or other “big data” systems – LFR is supercharged CCTV.

It is not my role to endorse or ban a technology but, while this technology is developing and not widely deployed, we have an opportunity to ensure it does not expand without due regard for data protection. My full Opinion is published on the ICO website.

Data protection and people’s privacy must be at the heart of any decision to deploy LFR, and the law sets a high bar to justify the use of LFR and its algorithms in places where we shop, socialise or gather.

The opinion piece is rooted in law and informed in part by six ICO investigations into the use, testing or planned deployment of LFR systems, as well as our assessment of other proposals that organisations have sent to us. Uses we’ve seen have included addressing public safety concerns and creating biometric profiles to target people with personalised advertising.

It is telling that none of the organisations involved in our completed investigations were able to fully justify the processing and, of those systems that went live, none of them were fully compliant with the requirements of data protection law. All of the organisations chose to stop, or not proceed with, the use of LFR.

With any new technology, building public trust and confidence in the way people’s information is used is crucial so the benefits derived from the technology can be fully realised.

In the US, people did not trust the technology. Some cities banned its use in certain contexts and some major companies have paused facial recognition services until there are clearer rules. Without trust, the benefits the technology may offer are lost.

And, if used properly, there may be benefits. LFR has the potential to do significant good – helping in an emergency search for a missing child, for example.

Organisations will need to demonstrate high standards of governance and accountability from the outset, including being able to justify that the use of LFR is fair, necessary and proportionate in each specific context in which it is deployed. They need to demonstrate that less intrusive techniques won’t work.

Organisations will also need to understand and assess the risks of using a potentially intrusive technology and its impact on people’s privacy and their lives. For example, they must consider how issues around accuracy and bias could lead to misidentification, and the damage or detriment that comes with it.
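Part of that misidentification risk is simple base-rate arithmetic: when a system scans large crowds for a small watchlist, even a low false-match rate generates far more false alarms than true matches. The sketch below illustrates this with entirely hypothetical figures (the visitor count, watchlist size and error rates are assumptions for illustration, not numbers from the ICO or any real deployment):

```python
# Illustrative base-rate arithmetic for a hypothetical LFR deployment.
# All figures are assumptions chosen for illustration only.

daily_visitors = 50_000        # assumed footfall at a shopping centre
watchlist_present = 5          # assumed watchlisted people actually present
false_positive_rate = 0.001    # assumed 0.1% false-match rate per face scanned
true_positive_rate = 0.90      # assumed 90% chance a listed person is flagged

# Innocent visitors wrongly flagged vs. watchlisted people correctly flagged.
false_alarms = (daily_visitors - watchlist_present) * false_positive_rate
correct_flags = watchlist_present * true_positive_rate

# Of all flags raised, what fraction are genuinely on the watchlist?
precision = correct_flags / (correct_flags + false_alarms)

print(f"False alarms per day: {false_alarms:.0f}")        # ~50
print(f"Flagged person actually listed: {precision:.1%}")  # ~8.3%
```

Under these assumed figures, roughly fifty innocent people a day would be stopped or scrutinised for every handful of genuine matches, which is one concrete way the "damage or detriment" of misidentification accumulates at scale.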

My office will continue to focus on technologies that have the potential to be privacy invasive, working to support innovation while protecting the public. Where necessary we will tackle poor compliance with the law.

We will work with organisations to ensure that the use of LFR is lawful, and that a fair balance is struck between their own purposes and the interests and rights of the public. We will also engage with government, regulators and industry, as well as international colleagues to make sure data protection and innovation can continue to work hand in hand.

Elizabeth Denham was appointed UK Information Commissioner on 15 July 2016, having previously held the position of Information and Privacy Commissioner for British Columbia, Canada.
