AI in Policing (LVL 3)

AI tools are developed with the aim of preventing crime, using techniques such as computer vision, pattern recognition, and the analysis of historical data to create crime maps: locations with a higher risk of offences. Whilst these tools may reduce on-the-fly human bias, they may also automate systemic biases. For example, facial recognition systems are less reliable for non-white individuals, especially for Black women.

Historical data may reflect the over-policing of certain locations and the under-policing of others. Those patterns become encoded in the algorithms, which then reinforce the over- and under-policing of the same areas in the future. The abundance of data can also make postcode a proxy for ethnicity.
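The feedback loop described above can be illustrated with a toy simulation (all numbers are hypothetical): two areas have the same true crime rate, but one starts with more recorded incidents because it was historically over-policed. Patrols are then allocated in proportion to past records, so the biased record compounds year after year.

```python
def simulate(years=10, true_rate=100):
    """Toy model of a predictive-policing feedback loop.

    Both areas have the SAME underlying crime rate (true_rate),
    but the historical record is biased: area A was over-policed.
    """
    recorded = {"A": 150, "B": 50}  # biased historical record, not true crime
    for _ in range(years):
        total = sum(recorded.values())
        for area in recorded:
            # patrols allocated in proportion to past recorded crime
            patrol_share = recorded[area] / total
            # more patrols in an area -> more of its (equal) true crime
            # gets observed and recorded there
            recorded[area] += true_rate * patrol_share
    return recorded

result = simulate()
# The initial bias compounds: the absolute gap in recorded crime between
# A and B widens every year, even though the true rates are identical.
```

The point of the sketch is that the algorithm never needs ethnicity or any explicit bias as an input; an unrepresentative historical record is enough to keep directing resources back to the same areas.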

How AI Could Reinforce Biases In The Criminal Justice System (LVL 4)

Whilst some believe AI will increase police and sentencing objectivity, others fear it will exacerbate bias. For example, the over-policing of minority communities in the past has generated a disproportionate…

Unmasking Facial Recognition (LVL 4)

unmasking-facial-recognition-webroots-democracy.pdf
A report exploring the racial bias challenges of the police's use of live facial recognition technology in the United Kingdom. It focuses…

Facial recognition could stop terrorists before they act (LVL 4)

In their zeal and earnest desire to protect individual privacy, policymakers run the risk of stifling innovation. The author makes the case that…

Is police use of face recognition now illegal in the UK? (LVL 4)

The UK Court of Appeal has determined that the use of a face-recognition system by South Wales Police was “unlawful”,…

UK police adopting facial recognition, predictive policing without public consultation (LVL 4)

UK police forces are largely adopting AI technologies, in particular facial recognition and predictive policing, without public consultation. This article alerts…

Data Analytics and Algorithmic Bias in Policing (LVL 4)

RUSI_Report_-_Algorithms_and_Bias_in_Policing.pdf
Algorithms used for predictive policing rely on datasets that are inherently biased because of the historical over- or under-policing of certain communities. This results in the…

Decision Making in the Age of the Algorithm (LVL 4)

A guide for public sector organisations on how to introduce AI tools so that they are embraced and used wisely. A good summary of…