AI tools are being developed to prevent crime using techniques such as computer vision, pattern recognition, and the mining of historical data to create crime maps: predictions of locations with a higher risk of offending. Whilst these tools may reduce on-the-fly human bias, they may also automate systemic biases. For example, facial recognition techniques are less reliable for non-white individuals, especially for Black women.
Historical data may reflect the over-policing of certain locations and the under-policing of others. Those patterns get encoded in the algorithms, which then reinforce the over- and under-policing of the same areas in the future. The abundance of data can also make postcode a proxy for ethnicity.
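The feedback loop described above can be illustrated with a toy simulation. This is only a sketch with hypothetical numbers, not a model of any real deployed system: two areas have identical true offence rates, but one starts with more recorded crime because it was historically over-policed, and patrols are allocated in proportion to recorded crime.

```python
def run_feedback_loop(recorded, true_rate, discovery, rounds):
    """Simulate patrols allocated proportionally to *recorded* crime.

    recorded:  initial recorded-crime counts per area (the historical data)
    true_rate: actual offences per area per round (identical here)
    discovery: fraction of offences recorded per patrol unit present
    """
    recorded = list(recorded)
    for _ in range(rounds):
        total = sum(recorded)
        # 100 patrol units, allocated in proportion to recorded crime
        patrols = [100 * r / total for r in recorded]
        # recorded crime grows with patrol presence, not with true crime
        for i, p in enumerate(patrols):
            recorded[i] += true_rate[i] * discovery * p
    return recorded

# Area A was historically over-policed (60 vs 40 recorded incidents),
# but the true offence rates are identical (10 per round in each area).
history = run_feedback_loop(recorded=[60, 40], true_rate=[10, 10],
                            discovery=0.02, rounds=20)
# Area A still accounts for 60% of all recorded crime: the historical
# disparity is frozen into the data and never self-corrects.
```

Because patrols follow recorded crime and recording follows patrols, the initial 60/40 disparity persists indefinitely even though the two areas offend at exactly the same rate.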
This video is an in-depth panel discussion of the issues uncovered in the ‘Unmasking Facial Recognition’ report from WebRootsDemocracy. This report found that facial recognition technology use is likely…
Can make-up be an anti-surveillance tool?
As protests against police brutality and in support of the Black Lives Matter movement continue in the wake of George Floyd’s killing, protection against mass…
Machine Bias
There’s software used across the country to predict future criminals. And it’s biased against blacks. This article details software used to predict the likelihood…
Another Arrest, and Jail Time, Due to a Bad Facial Recognition Match
A New Jersey man was accused of shoplifting and trying to hit an officer with a car. He is…
Whilst some believe AI will increase police and sentencing objectivity, others fear it will exacerbate bias. For example, the over-policing of minority communities in the past has generated a disproportionate…
Unmasking facial recognition
Facial recognition is not the next generation of CCTV. Whilst CCTV takes pictures, facial recognition takes measurements. Measurements of the distance between your eyes, the length of your…
Facial recognition could stop terrorists before they act
In their zeal and earnest desire to protect individual privacy, policymakers run the risk of stifling innovation. The author makes the case that…
Is police use of face recognition now illegal in the UK?
The UK Court of Appeal has determined that the use of a face-recognition system by South Wales Police was “unlawful”,…
UK police using facial recognition, predictive policing without public consultation
UK police forces are largely adopting AI technologies, in particular facial recognition and predictive policing, without public consultation. This article alerts…
A report into algorithms and bias in policing
This paper summarises the use of analytics and algorithms for policing within England and Wales, and explores different types of bias that can…