This article examines software used to predict the likelihood of criminal recidivism. Through case studies, it demonstrates the racial bias present in the risk scores the software produces: for comparable offenses, a white defendant was far more likely to be rated low-risk than a Black defendant. ProPublica compiled a study of risk scores across the USA to support these findings.