An algorithm widely used in US hospitals to allocate health care to patients has been systematically discriminating against black people, a sweeping analysis has found. The study concluded that the algorithm was less likely to refer equally sick black patients than white patients to programmes that aim to improve care for patients with […]
Data scientist Hiwot Tesfaye joins Greg for a conversation about the use of algorithms in healthcare and how models can introduce bias. They’ll discuss current examples of health care bias, who should be held responsible, and how the industry can do better in the future. With a particular focus on the allocation of […]
AI is helping healthcare organisations determine care management programs and treatment plans – who gets what care – but these models and algorithms can be biased and introduce discrimination in the allocation or denial of care.
AI-powered systems are being designed to support medical activities ranging from patient diagnosis and triaging to drug pricing. But when AI systems are trained on unrepresentative data sets, they stand to develop discriminatory biases. Three case studies are explored that demonstrate the potential for racial bias in medical AI, including how AI may be used […]
This article draws on recent medical research showing how potentially biased models informing our health care systems have affected COVID-19 care. These biased models could exacerbate the disproportionate impact the pandemic is having on people of colour. AI is not intrinsically objective, and any biases inherent in these technologies could reinforce racial structural injustices within […]
This article discusses mobile apps that aid the self-diagnosis of skin conditions. While the apps aim to be inclusive of all skin types, the training data was revealed to contain images of darker-skinned patients only 3.5% of the time. Ninety per cent of the database was made up of people with fair skin, darker white skin, […]
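None of the articles above publishes its data, so the sketch below is purely illustrative: a hypothetical toy simulation showing how a model fit to a training set in which one group makes up only a few per cent of examples (as with the 3.5% figure above) can look accurate overall while performing far worse for that group. The feature, the group shift, and the single-threshold "model" are all invented for illustration.

```python
import random

random.seed(0)

def make_patient(group):
    """Return (group, reading, sick) for a simulated patient.

    The toy assumption: the diagnostic signal is shifted for the
    minority group, so a threshold tuned on one group transfers
    poorly to the other.
    """
    sick = random.random() < 0.5
    base = 1.0 if sick else 0.0
    shift = 0.0 if group == "majority" else 0.6  # hypothetical shift
    reading = base + shift + random.gauss(0, 0.3)
    return group, reading, sick

# Skewed training data: 97% majority, 3% minority, mirroring the
# kind of imbalance the article describes.
train = ([make_patient("majority") for _ in range(970)]
         + [make_patient("minority") for _ in range(30)])

def accuracy(th, data):
    """Fraction of patients correctly flagged by threshold th."""
    return sum((reading > th) == sick
               for _, reading, sick in data) / len(data)

# "Training" = pick the single threshold that maximises accuracy on
# the skewed sample; it ends up fitted to the majority group.
best_th = max((t / 100 for t in range(-100, 200)),
              key=lambda t: accuracy(t, train))

test_maj = [make_patient("majority") for _ in range(1000)]
test_min = [make_patient("minority") for _ in range(1000)]
print(f"threshold: {best_th:.2f}")
print(f"majority accuracy: {accuracy(best_th, test_maj):.2f}")
print(f"minority accuracy: {accuracy(best_th, test_min):.2f}")
```

Overall accuracy on the pooled test set stays high because the majority group dominates it, which is exactly why per-group evaluation is needed to surface this failure.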
AI technologies are being used to diagnose Alzheimer’s disease by assessing speech, which could aid early diagnosis of the condition. However, the algorithms behind the technology are evidently trained on and for a specific tone of voice, excluding people of colour from its benefits.
This article explores three case studies that demonstrate the potential for racial bias in medical AI, including Melanoma Diagnosis and Diagnosis using […]
This article explores how the pulse oximeter, a device used to measure blood-oxygen levels in coronavirus patients, exhibits racial bias. Evidence in medical journals shows that pulse oximeters overestimated blood-oxygen saturation more frequently in black people than in white people.
An estimated 40 million people in the US alone have smartwatches or fitness trackers that can monitor heartbeats. However, some people of colour may be at risk of getting inaccurate readings. Heart rate trackers rely on technology that is designed for and tested on lighter-skinned individuals, meaning that the technology could be less reliable for […]