US Veterans Affairs suicide prevention algorithm favours white men

Occurred: May 2024

A US Department of Veterans Affairs (VA) AI programme called REACH VET, intended to prevent suicide amongst military veterans, has been found to prioritise white men and to overlook survivors of sexual violence.

According to an investigation by The Markup, the algorithm considers 61 variables and gives preference to veterans who are “divorced and male” or “widowed and male”, but includes no equivalent variables for female veterans. Notably, it does not take into account military sexual trauma or intimate partner violence, both of which are linked to elevated suicide risk among female veterans.
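To make the mechanism concrete, the sketch below is a purely hypothetical Python illustration; the variable names, weights and threshold are invented and are not drawn from the actual REACH VET model. It shows how a risk score built around male-only interaction terms, with no variables for risks that disproportionately affect women, will systematically rank male veterans higher.

```python
# Hypothetical illustration only: variables and weights are invented,
# NOT taken from the real REACH VET model.

def risk_score(veteran: dict) -> float:
    """Toy additive risk score using demographic interaction flags."""
    score = 0.0
    # Interaction terms of the kind the investigation describes:
    # marital status crossed with male sex adds risk, but there is no
    # corresponding term for female veterans.
    if veteran["divorced"] and veteran["male"]:
        score += 1.0
    if veteran["widowed"] and veteran["male"]:
        score += 1.0
    # Risk factors that disproportionately affect female veterans, such as
    # military sexual trauma, are absent from this toy model entirely,
    # so they can never raise a score.
    return score

# Two veterans with otherwise comparable histories:
male_vet = {"divorced": True, "widowed": False, "male": True}
female_vet = {"divorced": True, "widowed": False, "male": False}

print(risk_score(male_vet))    # 1.0 -> more likely to cross an outreach threshold
print(risk_score(female_vet))  # 0.0 -> never boosted by these terms
```

Under these assumptions, only the male veteran is ever flagged for outreach, which is the pattern of skew the investigation describes.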

Recent government data shows a 24 percent rise in the suicide rate among female veterans between 2020 and 2021, which is four times the increase among male veterans during the same period. Despite this, the VA’s suicide prevention algorithm continued to prioritise men.

The VA claims that its machine learning model flags 6,700 veterans a month for extra support, and that it results in a significant reduction in suicide attempts by those veterans over the following six months.

However, the algorithm’s bias towards white men and its neglect of survivors of sexual violence have raised concerns about its effectiveness and fairness.

System 🤖


Operator:
Developer: US Department of Veterans Affairs
Country: USA
Sector: Govt - welfare; Govt - defence
Purpose: Prevent suicide
Technology: Machine learning
Issue: Accuracy/reliability; Bias/discrimination - race, ethnicity, gender; Transparency