New Study Blames Algorithm For Racial Discrimination

This article examines a risk-prediction tool created by Optum, designed to identify high-risk patients with untreated chronic diseases so that medical resources could be redirected to those who need them most. Research has shown the algorithm to be biased: among equally sick patients, it was less likely to refer black patients than white patients to care-improvement programs. The algorithm used a surrogate measure of health need – the cost of each patient's past treatments. Because black patients have historically received less care than white patients for the same conditions, this proxy skewed the predictions in favour of white patients.
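
The mechanism can be illustrated with a toy sketch (this is a hypothetical simplification, not Optum's actual model): if past treatment cost is used as the proxy for health need, a patient who received less care for the same illness ends up with a lower "risk" score and misses the referral threshold.

```python
# Hypothetical illustration of proxy-label bias (not Optum's actual model):
# past cost stands in for health need, so unequal historical access to care
# translates into unequal risk scores for equally sick patients.

def risk_score_from_cost(past_cost, max_cost=10_000):
    """Toy risk score: past treatment cost scaled to the range [0, 1]."""
    return min(past_cost / max_cost, 1.0)

# Two equally sick patients (same chronic-condition count), but one
# historically received less care, so their past cost is lower.
patient_a = {"chronic_conditions": 4, "past_cost": 9_000}  # received more care
patient_b = {"chronic_conditions": 4, "past_cost": 4_500}  # received less care

score_a = risk_score_from_cost(patient_a["past_cost"])
score_b = risk_score_from_cost(patient_b["past_cost"])

# With a referral threshold of 0.8, only patient A is admitted to the
# care-improvement program despite identical health need.
threshold = 0.8
print(score_a >= threshold)  # True
print(score_b >= threshold)  # False
```

The numbers and threshold here are invented for illustration; the point is only that a cost-based label inherits whatever inequities shaped past spending.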