
New Study Blames Algorithm For Racial Discrimination, Ignores Physician Bias
How did an algorithm discriminate against black patients? The answer is simple: it didn’t. The technology didn’t discriminate; doctors did.
This article examines a tool created by Optum, designed to identify high-risk patients with untreated chronic diseases so that medical resources could be redirected to those who need them most. Research has shown the algorithm to be biased: it was less likely to refer black patients than equally sick white patients to care-improving programs. The algorithm utilised a surrogate measure, the cost of each patient’s past treatments. However, black patients have historically received less care than white patients for the same conditions, and this skewed the data in favour of white patients.
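To see how a cost proxy can reproduce that gap, consider a minimal, purely illustrative sketch in Python. The patients, figures and field names (illness_burden, past_spend) are hypothetical and are not drawn from Optum’s model; the point is only that ranking patients by spending will place a sicker but historically under-treated patient below an equally sick patient who happened to receive more care.

```python
# Purely illustrative: hypothetical patients and numbers, not Optum's data or model.
from dataclasses import dataclass

@dataclass
class Patient:
    name: str
    illness_burden: float   # true health need (e.g. number of active chronic conditions)
    past_spend: float       # historical cost of care, in dollars

# Two equally sick patients; one has historically received (and therefore cost) less care.
patients = [
    Patient("patient_a", illness_burden=4.0, past_spend=12_000.0),
    Patient("patient_b", illness_burden=4.0, past_spend=7_000.0),
]

def risk_score_by_cost(p: Patient) -> float:
    # Cost-based proxy: past spending stands in for future health need.
    return p.past_spend

def risk_score_by_need(p: Patient) -> float:
    # Need-based measure: rank directly on illness burden instead.
    return p.illness_burden

# Ranking by the cost proxy places the under-treated patient lower,
# even though both patients are equally sick.
for p in sorted(patients, key=risk_score_by_cost, reverse=True):
    print(f"{p.name}: cost-proxy score={risk_score_by_cost(p):,.0f}, "
          f"need score={risk_score_by_need(p):.1f}")
```

Because past spending reflects access to care as much as it reflects need, any score optimised to predict it inherits that gap.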