Occurred: June 2021
A number of healthcare delivery and planning algorithms widely used across the US, notably The Emergency Nurses Association (ENA)'s Emergency Severity Index, actively reinforced existing racial and economic biases.
Researchers at The University of Chicago Booth School of Business' Center for Applied Artificial Intelligence assessed (pdf) algorithms used to triage emergency room patients and predict diabetes, among other applications.
The Emergency Nurses Association (ENA)'s Emergency Severity Index was found to underestimate the severity of Black and Hispanic patients' conditions, perpetuating inequitable treatment in the areas where they live for at least a decade, STAT reported.
The finding highlighted concerns about racial, ethnic, and geographic bias in the assignment of ESI scores to patients.
Operator: Brigham and Women’s Hospital, Boston; Emergency Nurses Association (ENA)
Developer: Emergency Nurses Association (ENA)
Country: USA
Sector: Health
Purpose: Assess medical condition
Technology: Triage algorithm
Issue: Accuracy/reliability; Bias/discrimination - race, ethnicity, economy
Page info
Type: Issue
Published: October 2021