Machine Bias – There’s software used across the country to predict future criminals and it’s biased against blacks

This article examines software used to predict the likelihood of recidivism, drawing on case studies to demonstrate racial bias in the risk scores it produces. For a similar crime, a white defendant is considerably more likely to be rated low-risk than a Black defendant. ProPublica reached this conclusion by compiling and analyzing risk scores from across the USA, finding that the algorithm is biased against Black defendants.
