Machine Bias – There’s software used across the country to predict future criminals and it’s biased against blacks

This article examines software used across the United States to predict the likelihood of reoffending (recidivism). Through case studies, it demonstrates the racial bias present in the risk scores the software assigns: for comparable crimes, a white defendant is far more likely to be rated low-risk than a Black defendant. To support this finding, ProPublica compiled and analyzed risk scores from across the USA.
