Humana accused of using AI to deny health insurance

Occurred: December 2023

A class-action lawsuit accused US healthcare insurer Humana of wrongfully using an AI model to deny elderly people key rehabilitation care.

In November 2021, one of the plaintiffs, 86-year-old JoAnne Barrows, was discharged to a rehabilitation facility after being hospitalised following a fall. She was under a non-weight-bearing order for six weeks due to a leg injury, according to the lawsuit. 

According to the lawsuit, Humana informed Barrows that it would end her coverage after just two weeks at the rehabilitation facility, even though she was to remain non-weight-bearing for another month. She appealed the decision but was denied, forcing her family to pay out of pocket for the rehabilitation she needed.

Humana responded by saying it uses 'various tools, including augmented intelligence, to expedite and approve utilization management requests', and that the company maintains 'human in the loop' decision-making whenever AI is utilised.

A November 2023 lawsuit filed against UnitedHealth alleged that naviHealth's nH Predict tool has a '90% error rate', and that the insurer continued to use it because few members tend to appeal claims denials.

Databank

Operator: Humana Inc
Developer: UnitedHealth Group; Cardinal Health; SeniorMetrix
Country: USA
Sector: Health
Purpose: Predict post-acute care needs
Technology: Prediction algorithm
Issue: Accuracy/reliability; Accountability
Transparency: Governance; Black box; Complaints/appeals