
The right to a fair trial
23. We were concerned that, in some instances, the use of advanced tools at certain points of the criminal justice pipeline may impede an individual's right to a fair trial: whether by a lack of awareness that they were being used, unreliable evidence, or an inability to understand and therefore challenge proceedings.
Dr Arianna Andreangeli thought that textual analysis tools could “adversely affect the ability of the investigated parties to have a reasonable opportunity to understand the charges made against them, to appreciate the assessment of the evidence made by the competition agencies and ultimately to build their own defence against these allegations.”51 Professor Elizabeth Joh,
Professor of Law at the University of California, referred to the multiplicity of technologies that could be used in building a case, saying:
“It is often very difficult for individual criminal defendants even to know what types of technologies might have been used in their particular case. Of course, that adds to the difficulty of raising challenges to a particular technology when you are not even sure what combination of licence plate reader data, facial recognition technology or predictive policing software might have led to the identification of you as a particular suspect.”52
24. One contributor also told us that algorithmic technologies could be used without the court being made aware, and that in some cases evidence may have been subject to "manipulation". David Spreadborough, a Forensic Analyst, gave the example that algorithmic error correction could be built into CCTV systems, meaning that "parts of a vehicle, or a person, could be constructed artificially",53 without the court's knowledge.
25. Another contributor suggested that the judiciary may feel compelled to defer to algorithmic suggestions, rendering judges "the long arm of the algorithm".54 A solid understanding of where advanced technologies may appear, how they work, what their weaknesses are, and how their validity will be determined is therefore a critical safeguard for the right to a fair trial.
26. We see serious risks that an individual's right to a fair trial could be undermined by algorithmically manipulated evidence. We therefore favour precise documentation, evaluation by subject experts, and transparency where evidence may be subject to algorithmic manipulation.
Equality
27. Technologies were seen to be reproducing inequities. Where the data fed to a machine learning programme was biased, the "resulting predictions are likely to present inaccurate or biased depictions of criminal activity", which Big Brother Watch thought likely to lead to "discriminatory policing interventions."55 Liberty were similarly concerned that technologies which can be categorised as predictive policing "entrench pre-existing patterns of discrimination".
51 Supplementary written evidence from Dr Arianna Andreangeli (NTL0039). For further consideration of bias in facial recognition, see para 161.
52 Q 44 (Professor Elizabeth Joh)
53 Written evidence from David Spreadborough (NTL0015)
54 Written evidence from Dr Kyriakos N. Kotsoglou (NTL0006)
55 Written evidence from Big Brother Watch (NTL0037)