The advent of new technologies in the justice system
Automated decision making (ADM): the process of making a decision by automated means without any human involvement. Decisions can be made based on factual data or inferred data. An example is a system counting correct answers on a multiple-choice test sheet and then attributing a pass or fail mark based on the number of correct answers. It could also include an online decision to award a loan or provide a quote for the cost of a service.5

Data analysis: combining data from a variety of sources and applying organisational, statistical, mathematical or machine learning calculations to retrieve data, find a pattern, produce a conclusion or make a prediction.

Machine learning (ML): a branch of AI that allows a system to learn and improve from examples without all its instructions being explicitly programmed. ML systems are provided with large volumes of different categories of data, and identify patterns that distinguish one category from another. They thus ‘learn’ to process future data, extrapolating their knowledge to unfamiliar situations. Applications of ML include virtual assistants (such as ‘Alexa’), recommender systems, and facial recognition.6

Technological solution: a method by which data, digital software tools, AI and/or new technologies can be used (fully or partially) to provide a service, provide information, undertake a task, make a decision or change the way something is done.

The rapid deployment of new technologies

3. When we began our inquiry, we were aware of several technological solutions used in the application of the law. We had heard of ‘predictive policing’, which we understood as the use of historic data to predict where and when certain types of crime may be more likely to occur, and the use of those predictions to plan policing priorities and strategies. We were also aware of visa streaming algorithms: processing tools used by visa-issuing authorities to triage applications and decide which should be investigated. The use of facial recognition was of concern to us as well, especially in relation to privacy rights and risks of discrimination, which have been widely reported.7
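The simplest form of automated decision making described above, marking a multiple-choice test and awarding a pass or fail with no human involvement, can be sketched in a few lines of code. This is an illustrative toy only: the answer key, the responses, and the pass mark are invented for the example, not drawn from any real system.

```python
# Toy automated decision-making (ADM) sketch: mark a multiple-choice test
# and attribute a pass or fail based on the number of correct answers.
# ANSWER_KEY and PASS_MARK are hypothetical values for illustration.

ANSWER_KEY = ["B", "C", "A", "D", "B", "A", "C", "D"]
PASS_MARK = 5  # hypothetical threshold: correct answers needed to pass

def mark_test(responses: list[str]) -> tuple[int, str]:
    """Count correct answers and return (score, decision) automatically."""
    score = sum(1 for given, correct in zip(responses, ANSWER_KEY)
                if given == correct)
    decision = "pass" if score >= PASS_MARK else "fail"
    return score, decision

print(mark_test(["B", "C", "A", "D", "A", "A", "B", "D"]))  # (6, 'pass')
```

The point of the sketch is that every step, counting, comparing against a threshold, and issuing the decision, is fixed in advance by the system's designer; no human reviews the individual outcome.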
4. Written evidence from Dr Miri Zilka, Dr Adrian Weller and Detective Sergeant Laurence Cartwright laid out some categories of tools used in the justice system. While our scope is wider, most of the tools we heard of fit into these categories:
(a) Data infrastructure: software and tools primarily used to record, store, organise, and search data.
5 Information Commissioner’s Office, ‘What is automated individual decision-making and profiling?’: https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/automated-decision-making-and-profiling/what-is-automated-individual-decision-making-and-profiling/ [accessed 6 February 2022]
6 The Royal Society, Machine learning: the power and promise of computers that learn by example (April 2017): https://royalsociety.org/~/media/policy/projects/machine-learning/publications/machine-learning-report.pdf [accessed 6 February 2022]
7 ‘UK’s facial recognition technology ‘breaches privacy rights’’, The Guardian (23 June 2020): https://www.theguardian.com/technology/2020/jun/23/uks-facial-recognition-technology-breaches-privacy-rights [accessed 6 February 2022]. See also Harvard University, The Graduate School of Arts and Sciences, ‘Racial Discrimination in Facial Recognition Technology’ (24 October 2020): https://sitn.hms.harvard.edu/flash/2020/racial-discrimination-in-face-recognition-technology/ [accessed 6 February 2022].