FACIAL RECOGNITION TECHNOLOGY AND THE LAW: ARE EXISTING PRIVACY AND SURVEILLANCE LAWS FIT FOR PURPOSE?
CAITLIN SURMAN, SENIOR ASSOCIATE, HWL EBSWORTH
Over the past few years, the development and use of Facial Recognition Technology (FRT) throughout Australia has grown exponentially. That growth has been accompanied by widespread concern about the capacity of existing legislative frameworks to regulate FRT appropriately, and by the lack of specific legislation regulating its use.
While lawmakers grapple with what that new legislative framework might look like, this article considers how Australia’s existing privacy and surveillance laws deal with FRT, including whether those laws adequately safeguard the use of FRT, and options for future reforms to these frameworks.
WHAT IS FRT AND HOW IS IT USED?
FRT involves the automated extraction, digitisation and comparison of spatial and geometric distribution of facial features. Using an algorithm, FRT compares an image of a face with an image stored in a database, in order to identify a match.1
FRT is deployed in two main ways:
1. ‘one-to-one’ FRT, which is used to verify the identity of an individual by checking one image against a single, corresponding image to determine if they are the same person.2 It is often utilised in a controlled environment where the lighting is sufficient and the subject is in an optimal position to facilitate a successful comparison,3 and its most common application is unlocking a smartphone; and
2. ‘one-to-many’ FRT, which is used to identify an unknown individual by comparing a single image against a large database.4
This article focuses on ‘one-to-many’ FRT, which seeks to match a single facial image with a different facial image of the same individual that has been stored in a large database. It therefore relies on a much larger dataset to conduct a comparison, whilst the facial image being compared against the dataset is often taken from ‘the wild’ (eg CCTV surveillance) and is of lower quality.5 As a result, identifying a person using ‘one-to-many’ FRT is more difficult and prone to false matches and misidentification.6
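The distinction between the two modes can be illustrated with a minimal sketch. The short numeric "templates", names and threshold below are illustrative assumptions only, not a description of any deployed FRT system; real systems derive high-dimensional templates from face images using machine-learning models.

```python
import math

# Hypothetical sketch: each face is reduced to a short numeric "template".
# Real FRT uses high-dimensional embeddings produced by a trained model.

def distance(a, b):
    # Euclidean distance between two templates: smaller means more similar.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def verify(probe, enrolled, threshold=0.6):
    # 'One-to-one' FRT: compare the probe against one enrolled template
    # and accept or reject the claimed identity.
    return distance(probe, enrolled) < threshold

def identify(probe, database, threshold=0.6):
    # 'One-to-many' FRT: compare the probe against every template in a
    # database and return the closest match, if it is close enough.
    name, template = min(database.items(),
                         key=lambda kv: distance(probe, kv[1]))
    return name if distance(probe, template) < threshold else None
```

On this simplified model, a lower-quality probe image taken ‘in the wild’ shifts its template away from the enrolled one, and the larger the database, the greater the chance that some unrelated template falls within the threshold — which is consistent with the observation above that ‘one-to-many’ matching is more prone to false matches.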
In Australia, FRT is often used by banks and telecommunications companies for identity verification purposes,7 and is used extensively by immigration authorities to verify the identity of passport holders at international borders/airports, as well as by law enforcement agencies throughout Australia for crime prevention and suspect identification purposes. Locally, SAPOL fully implemented its own FRT system (called ‘NEC NeoFace system’) in the Adelaide CBD in 2019, which integrates FRT with CCTV, ATM, and some social media footage.8 In November 2021, the Adelaide City Council announced plans to roll out an updated City Safe CCTV Network that will involve the introduction of facial and number plate recognition.9
EXISTING SURVEILLANCE LAWS
Application to FRT
There is no Commonwealth legislation that regulates the use of surveillance devices.10 Instead, this is currently governed by state and territory legislation. The relevant piece of legislation in South Australia is the Surveillance Devices Act 2016 (SA) (SDA).
The SDA prohibits:
1. the knowing installation, use or maintenance of an ‘optical surveillance device’11 by a person on a ‘premises’12 that visually records or observes a ‘private activity’ without the express or implied consent of all the key parties;13 and
2. the knowing use, communication or publication of information or material derived from the use of an optical surveillance device.14
The regulation of an optical surveillance device under the SDA is linked to the concept of a ‘private activity’, meaning an activity carried on in circumstances that may reasonably be taken to indicate that one or all of the parties do not want the activity to be observed by others.15 Accordingly, the SDA might prohibit FRT in circumstances where it is used for covert optical surveillance (unless an exception applies).
The definition of ‘private activity’ excludes activities carried on in a public place.16 Accordingly, public authorities can use devices with FRT to monitor the activities of the general public in public spaces, or semi-public spaces, without breaching the SDA.
Even if a person or government authority is prohibited by the SDA from using a device with FRT, section 5(4) of the SDA sets out several exceptions to the general rule. These exceptions include where the use of the optical surveillance device is reasonably necessary for the protection of the ‘lawful interests’ of that person, or where the use of the device is in connection with the execution of a ‘surveillance device warrant’ or ‘surveillance device (emergency) authority’.
The term ‘lawful interest’ is not defined by the SDA, but the concept was given judicial consideration in Nanosecond Corporation Pty Ltd v Glen Carron Pty Ltd (2018) 132 SASR 63 (Nanosecond), where Doyle J held that the recording of a private conversation ‘just in case’ it might prove advantageous in future civil litigation is not enough for the purpose of establishing a lawful interest. The Court is more likely to find that a recording has been made in the protection of a person’s lawful interests where the conversation relates to an allegation of a serious crime or resisting such an allegation, or where a dispute has ‘crystallised into a real and identifiable concern about the imminent potential for significant harm to the commercial or legal interests’ of a person.17 Whilst Nanosecond concerned the use of a listening device, the same principles arguably apply to the recording of a private activity via an optical surveillance device with FRT.
A further exception is contained in section 6(2) of the SDA, which provides that the prohibition on the use of an optical surveillance device does not apply if the use of the device is in the ‘public interest’. The term ‘public interest’ is not defined by the SDA.18
EXISTING PRIVACY LAWS
Application to FRT
Although the thirteen Australian Privacy Principles (APPs) in Schedule 1 to the Privacy Act 1988 (Cth) (Privacy Act) are intended to be technology neutral, so as to preserve their relevance and applicability to changing technologies,19 questions remain as to whether the APPs and Privacy Act sufficiently protect privacy where FRT is deployed.
Australian privacy law treats biometric information as personal information.20 In particular, ‘biometric information’ that is to be used for the purpose of ‘automated biometric verification’ or ‘biometric identification’, or ‘biometric templates’, is a type of ‘sensitive information’ for the purposes of the Privacy Act and APPs.21
‘Biometric information’ is not defined by the Privacy Act or APPs, but it is generally regarded as information that relates to a person’s physiological or biological characteristics that are persistent and unique to the individual (including their facial features, iris or hand geometry),22 and which can therefore be used to validate their identity.23
The terms ‘automated biometric verification’ and ‘biometric identification’ are not defined by the Privacy Act or the APPs either. However, the Biometrics Institute defines ‘biometrics’ as encompassing a variety of technologies in which unique attributes of people are used for identification and authentication,24 while the OAIC (Office of the Australian Information Commissioner) has indicated (in effect) that a technology will be ‘automated’ if it is based on an algorithm developed through machine learning technology.25
A ‘biometric template’ is a mathematical or digital representation of an individual’s biometric information.26 Machine learning algorithms then use the biometric template to match it with other biometric information for verification or identification purposes.27
Given the breadth of the definitions of ‘biometric information’, ‘automated biometric verification’, ‘biometric identification’ and ‘biometric template’, the majority of biometric information captured by FRT is likely to fall within the protections of the Privacy Act and APPs, and the safeguards contained in the Privacy Act and APPs will therefore apply to any biometric information collected by any FRT deployed by an ‘APP entity’.28
Current Safeguards
As a form of ‘sensitive information’, biometric information is afforded a higher level of privacy protection under the Privacy Act and APPs than other personal information, in recognition that its mishandling can have adverse consequences for an individual,29 meaning that an APP entity that collects and uses a person’s biometric information via FRT must adhere to stricter requirements.
Consent
The key requirements are contained in APP 3, which (in effect) provides that an APP entity may only solicit and collect a person’s biometric information if the information is reasonably necessary for one or more of the APP entity’s functions or activities,30 the biometric information has been collected by ‘lawful and fair means’,31 and the person consents to the collection of their biometric information (unless an exception applies).32
Consent for the purpose of the Privacy Act and APPs can be either ‘express consent’ or ‘implied consent’.33 As a general rule, an APP entity should seek express consent to the collection of sensitive information (including biometric information) as the potential privacy impact is greater.34 In either case, however, an individual must be adequately informed before giving consent.35
The Privacy Act and APPs contain five exceptions to the requirement for an APP entity to obtain a person’s consent prior to collecting sensitive information (including biometric information).36 The exceptions are broad and include:
1. where it is unreasonable or impracticable to obtain a person’s consent to the collection, and the APP entity reasonably believes the collection is necessary to lessen or prevent a serious threat to the life, health or safety of any individual, or to public health or safety;37
2. where the APP entity has reason to suspect that unlawful activity, or misconduct of a serious nature, that relates to the APP entity’s functions or activities has been, is being, or may be engaged in, and reasonably believes that the collection is necessary in order for the entity to take appropriate action in relation to the matter;38 and
3. where an ‘enforcement body’39 reasonably believes that collecting the information is reasonably necessary for, or directly related to, one or more of the body’s functions or activities.40
Use & Disclosure of Biometric information
As a type of sensitive information, special requirements also apply to the use and disclosure of biometric information after it has been collected via FRT. APP 6 provides that an APP entity can only use or disclose biometric information for the original/primary purpose for which it was collected. For example, if a company collects the image of a person’s face for the purpose of unlocking their smartphone, the company would not (without consent) be permitted to use the individual’s face for an unrelated purpose, such as to build a database of people whose information could then be sold to a third party for marketing purposes.41
Biometric information can only be used or disclosed for a secondary purpose if an exception contained in APP 6.1 applies. Those exceptions include where the individual has consented to that secondary use or disclosure,42 or where an individual would ‘reasonably expect’43 the entity to use or disclose the information for that secondary purpose and the secondary purpose is directly related44 to the primary purpose of collection. There are also specific exceptions which enable an APP entity to share a person’s personal information (including their biometric information) with enforcement bodies.45
CONCERNS WITH EXISTING LAWS
Concerns with surveillance laws
Given how broad the legislated exceptions are, concerns have arisen that relying on them to justify the use of devices integrating FRT disproportionately affects a person’s privacy. The decision in Nanosecond curtails any such invasion to a limited extent by ensuring that the ‘lawful interest’ exception cannot be relied on to use FRT to visually monitor a person in anticipation that they might do something that impinges upon a person’s lawful interests. However, clearer statutory limits on what constitutes a ‘lawful interest’ would be helpful while the case law evolves.
Similarly, a key concern raised in respect of FRT and the public interest exception is that its widespread use in public places is not necessary or proportionate to a goal of crime prevention or public safety, and that the use of FRT therefore improperly invades a person’s privacy.46 Options to prevent any unnecessary incursions on a person’s privacy could include requiring that the optical surveillance be ‘reasonably necessary’ to protect the public interest, and introducing a list of non-exclusive statutory considerations that must be taken into account when undertaking that assessment.
Concerns with privacy laws
Scope
The Privacy Act and APPs are federal laws that only apply to organisations and agencies deploying FRT that fall within the definition of an ‘APP entity’. The definition of an ‘APP entity’ does not include state and territory authorities or agencies, or organisations with an annual turnover of less than $3 million.47 Whilst some jurisdictions have their own specific privacy legislation that steps in to help safeguard a person’s privacy where FRT is used, there are other jurisdictions where no specific privacy legislation exists at all (including South Australia).
In South Australia, the State public sector is required to comply with the South Australian Information Privacy Principles (IPPs).48 However, the IPPs do not extend to biometric information, and there is no other legal framework in South Australia which holds to account those agencies, authorities and organisations that fall outside the scope of the Privacy Act and APPs.
No true consent
In the past year, the OAIC has issued two determinations in which it found that the collection of biometric information by two separate companies (Clearview AI49 and 7-Eleven50) contravened the consent requirements of the Privacy Act and APPs. These determinations demonstrate that, whilst the OAIC is conscious of the privacy issues posed by FRT, the consent model under the current privacy regime is ill-equipped for FRT.
The Privacy Act and APPs strictly require that APP entities collecting biometric information via FRT obtain express consent, but the nature of FRT means that it is not practical (or often possible) to obtain true, express consent from individuals whose biometric information might be captured by FRT. Whilst obtaining express consent is arguably more realistic where ‘one-to-one’ FRT is being utilised for a specific purpose in a controlled environment, it is hard to imagine a scenario where an APP entity deploying ‘one-to-many’ FRT would (or could) take steps to obtain express consent from every person whose biometric information it might capture. Accordingly, an APP entity that deploys FRT will usually need to infer a person’s consent to the collection of their biometric information by FRT.
Even though inferred consent is an option, it is difficult for APP entities deploying FRT to provide people with enough information about how FRT collects and uses their biometric information before FRT captures their image. This means that most people captured by FRT will not have been properly informed about what they were
consenting to. Further, an individual will often not have the ability to refuse to provide their consent to the use of FRT, and may feel compelled to provide it due to the inconvenience of not doing so, or due to their lack of bargaining power. For example, although 7-Eleven displayed a notice at the entrance to its stores to alert customers that they would be subject to FRT when they entered the store,51 and sought to infer that any customer who then chose to enter the store had provided consent, it is arguable that the customer had no choice (particularly if there were no convenient alternatives available to them).
Breadth of exceptions
Another criticism levelled at the Privacy Act and APPs is that the exemptions to the consent requirements of APP 3, and the single purpose requirement of APP 6, are too broad and do not sufficiently protect people against invasions of privacy. The exemptions in the Privacy Act which allow for the collection and use/disclosure of sensitive information (including biometric information) without consent have been made on the basis of balancing individual interests against those of collective security.52 However, this balancing approach has arguably resulted in individual privacy being ‘traded off’ against the wider community interests of ‘preventing, detecting and prosecuting crime’.53
WHERE TO FROM HERE?
The issues identified in this article suggest that a review and assessment of existing privacy and surveillance laws is needed to address the unique challenges posed by biometric technologies. It is clear that while existing privacy and surveillance laws place a number of safeguards on the use of FRT in private enterprise, there is a gap in the regulation of the use of FRT by government authorities (particularly in South Australia). This is particularly concerning when FRT is used by government authorities to make decisions that might infringe on an individual’s human rights in the context of policing and law enforcement.
In March 2021, the Australian Human Rights Commission released the Human Rights and Technology Final Report 2021, which made a number of recommendations for the regulation of FRT, including the introduction of tailored legislation that regulates the use of FRT, and the introduction of a statutory cause of action for serious invasions of privacy.54 These recommendations have been made at the same time that the privacy law regime in Australia is undergoing a comprehensive review. Accordingly, it is hoped that those reviews can result in the incorporation of additional, more tailored safeguards to help balance the benefits flowing from the use of FRT against its risks to personal privacy.