Figure 1: Predictive policing—a vicious circle?
discrimination”,56 and Professor Nigel Harvey and Tobias Harvey wrote that “learning algorithms based on historical data would preserve bias”.57
28. We do not have sufficient information to draw firm conclusions about the kind of crimes that are most heavily policed with algorithmic technology, but draw attention to a reflection from Professor Karen Yeung, Interdisciplinary Professorial Fellow in Law, Ethics and Informatics at the University of Birmingham, which suggests a concerning tendency:
“We are not building criminal risk assessment tools to identify insider trading or who is going to commit the next kind of corporate fraud because we are not looking for those kinds of crimes, and we do not have high-volume data. This is really pernicious. We are looking at high-volume data that is mostly about poor people, and we are turning it into prediction tools about poor people. We are leaving whole swathes of society untouched by those tools.”58
29. In a similar vein, the logical consequence of ‘predictive policing’ tools, which indicate where crime is likely to occur, is that police officers patrol those areas. Liberty argued that areas identified in this way were likely to be “subject to over-policing (that is, a level of policing that is disproportionate to the level of crime that actually exist[s])”.59 Because of the increased police presence, a higher proportion of the crimes committed in those areas is likely to be detected than in areas which are not over-policed. The data will reflect this increased detection rate as an increased crime rate, which will be fed back into the tool and embed itself in the next set of predictions: a vicious circle.
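The feedback loop described above can be made concrete with a deliberately simplified simulation. The sketch below is illustrative only and is not drawn from the Committee's evidence: the two areas, the 70/30 patrol split and the detection rates are invented assumptions, chosen solely to show how a small, chance gap in recorded crime can lock in once recorded data directs patrols and patrol presence in turn shapes what gets recorded.

```python
# A deliberately simplified, hypothetical model of the feedback loop.
# Both areas have IDENTICAL true offence rates; the only difference is a
# small, chance gap in the historical record. All numbers are invented.
true_offences = {"Area A": 100, "Area B": 100}   # identical underlying offending
recorded = {"Area A": 55, "Area B": 50}          # small, chance gap in past records

for period in range(1, 6):
    # The "tool" flags the area with the higher recorded count as the hotspot
    # and directs most patrols there.
    hotspot = max(recorded, key=recorded.get)
    patrol_share = {area: 0.7 if area == hotspot else 0.3 for area in recorded}

    # Toy assumption: the detection rate rises with patrol presence.
    detection_rate = {area: 0.3 + 0.4 * patrol_share[area] for area in recorded}

    # Recorded crime this period = true offences that happen to be detected.
    recorded = {area: round(true_offences[area] * detection_rate[area])
                for area in recorded}

    print(f"Period {period}: hotspot = {hotspot}, recorded = {recorded}")
```

In this toy setting the area that starts with the marginally higher recorded count is flagged as the hotspot in every period, and its recorded crime settles at roughly 38 per cent above the other area's, even though the true offence rates are identical: the recorded data never corrects itself because it is itself a product of where the patrols were sent.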
56 Written evidence from Liberty (NTL0020)
57 Written evidence from Professor Nigel Harvey and Tobias Harvey (NTL0025)
58 Q 60 (Professor Karen Yeung)
59 Written evidence from Liberty (NTL0020)