Study: UnitedHealth follow-up care algorithm is systematically racially biased
Occurred: October 2019
An algorithm used by UnitedHealth Group for follow-up care is systematically biased against Black patients, leading to significant disparities in healthcare access and treatment, researchers say.
Designed to predict patient health risks and allocate follow-up care, UnitedHealth Group's ALERT system was found to assign white patients higher risk scores than Black patients with similar health conditions, in effect classifying them as sicker.
As a result, fewer Black patients received care management services: only around 18 percent of the patients the algorithm flagged as needing extra care were Black, when the researchers estimate the proportion should have been closer to 46 percent.
The underlying issue stems from the algorithm's reliance on historical healthcare spending as a proxy for health needs. Because Black patients have long had poorer access to care, less money is spent on their treatment at any given level of illness, so the algorithm systematically underestimates how sick they are.
Consequently, Black patients who are often sicker, but whose historical healthcare expenditure is lower, are overlooked by the algorithm.
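This mechanism can be illustrated with a small simulation. The Python sketch below is a hypothetical, simplified illustration built on entirely synthetic data: the group labels, cost structure, feature set and 10 percent cut-off are assumptions made for the example and do not describe the actual ALERT system, its features, or the study's dataset.

# Illustrative sketch only, using synthetic data: it shows how a risk model
# trained to predict healthcare *cost* rather than *illness* can under-select
# patients from a group that historically generated less spending at the same
# level of need. It does not reflect UnitedHealth's actual data or model.
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

group = rng.integers(0, 2, n)            # 0 / 1 = two hypothetical patient groups
illness = rng.poisson(3.0, n)            # true health need (e.g. chronic conditions)

# Assumed mechanism: unequal access means lower spending at equal illness.
access = np.where(group == 1, 0.7, 1.0)
cost = access * (1_000 * illness) + rng.normal(0, 800, n)
visits = rng.poisson(access * illness)   # utilisation also tracks access, not just need

# Cost-trained "risk score": least-squares fit of spending on observed features,
# standing in for whatever clinical and claims features a production model uses.
X = np.column_stack([np.ones(n), illness, visits])
beta, *_ = np.linalg.lstsq(X, cost, rcond=None)
risk_score = X @ beta

# Flag the top 10% of predicted spenders for care management.
flagged = risk_score >= np.quantile(risk_score, 0.90)
sickest = illness >= np.quantile(illness, 0.90)

print(f"Group 1 share of patients flagged by the cost proxy: {group[flagged].mean():.0%}")
print(f"Group 1 share of the genuinely sickest decile:       {group[sickest].mean():.0%}")

Because the model's target is spending rather than illness, the flagged cohort under-represents the lower-spending group even though, by construction, both groups are equally sick.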
For those affected, bias of this kind can mean inadequate treatment of serious health conditions, exacerbating existing health disparities and contributing to poorer outcomes for Black patients.
On a societal level, reliance on biased algorithms reinforces racial inequities in healthcare access and quality, potentially codifying discrimination within medical practices and insurance policies.
The findings highlight the need for reform in how algorithms are developed and implemented in healthcare settings to ensure equitable treatment for all patients.
ALERT 🔗
Operator:
Developer: UnitedHealth Group
Country: USA
Sector: Health
Purpose: Analyse clinical and claims data
Technology: Machine learning; Pattern recognition; Risk assessment algorithm
Issue: Bias/discrimination - race
Obermeyer Z., Powers B., Vogeli C., Mullainathan S. (2019). Dissecting racial bias in an algorithm used to manage the health of populations. Science, 366(6464), 447-453.
https://www.theguardian.com/society/2019/oct/25/healthcare-algorithm-racial-biases-optum
https://www.statnews.com/2019/10/24/widely-used-algorithm-hospitals-racial-bias/
https://www.healthcarefinancenews.com/news/study-finds-racial-bias-optum-algorithm
https://magazine.publichealth.jhu.edu/2023/rooting-out-ais-biases
Page info
Type: Incident
Published: January 2025