Murdered Spanish woman Lobna Hemid wrongly assessed by domestic violence risk algorithm

Occurred: January 2022

A Spanish woman who reported to the authorities that she was being physically abused by her husband was murdered by him despite having been classified as "low risk" by an algorithm designed to assess the risk of domestic violence.

What happened

Despite Lobna Hemid providing Spanish authorities with evidence of regular attacks by her husband, the country's VioGén algorithm classified the risk to her safety as low.

Seven weeks later, Lobna was stabbed to death by her husband in their apartment outside Madrid. Her killer then took his own life.

The New York Times reported that, of 98 women killed by a current or former partner since 2007 after being assessed by VioGén, a judicial review found that at least 55 had been scored as negligible or low risk for repeat abuse.

The NYT also found that victims rarely knew about the role VioGén played in their cases, that the government had not published comprehensive data about the system's effectiveness, and that it had refused to make the algorithm available for outside audit.

Why it happened

The VioGén algorithm is seen to have failed to accurately assess the risks to Lobna Hemid and others killed due to factors reportedly including its heavy reliance on victims' answers to a standardized questionnaire, often completed in the immediate aftermath of an attack, and a tendency among police officers to accept the algorithm's scores without further investigation.
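
To make the failure mode easier to picture, the minimal Python sketch below shows how a questionnaire-based risk scorer of this general kind can work: weighted yes/no answers are summed into a score, and the total is mapped to a risk tier. VioGén's real questions, weights, and thresholds have not been disclosed, so every indicator name and number here is a hypothetical assumption, not the actual system.

```python
# Purely illustrative sketch: VioGén's actual questions, weights, and
# thresholds are not public (the government has refused outside audits),
# so every indicator name and number below is hypothetical.

# Hypothetical yes/no indicators with assumed weights.
INDICATOR_WEIGHTS = {
    "prior_physical_violence": 3,
    "threats_with_weapon": 5,
    "escalating_frequency": 4,
    "victim_fears_for_life": 4,
    "aggressor_substance_abuse": 2,
}

# Hypothetical cut-offs mapped onto tier names; "negligible" and "low"
# are the tiers mentioned in the judicial review cited above.
THRESHOLDS = [
    (0, "negligible"),
    (4, "low"),
    (8, "medium"),
    (12, "high"),
    (15, "extreme"),
]


def assess_risk(answers: dict[str, bool]) -> str:
    """Sum the weights of every 'yes' answer, then map the total to a tier."""
    score = sum(weight for question, weight in INDICATOR_WEIGHTS.items()
                if answers.get(question, False))
    tier = THRESHOLDS[0][1]
    for cutoff, label in THRESHOLDS:
        if score >= cutoff:
            tier = label
    return tier


if __name__ == "__main__":
    # A victim reporting genuine abuse can still land in a low tier if the
    # indicators the model weights most heavily are answered "no" - one way
    # a person at real risk can be misclassified.
    answers = {"prior_physical_violence": True, "aggressor_substance_abuse": True}
    print(assess_risk(answers))  # score 5 -> "low"
```

As the worked example shows, a hard threshold can place a victim reporting real abuse just inside a low-risk tier, which is one reason human review of such scores matters.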

What it means

The deaths of Lobna Hemid and others raise major concerns that other victims of domestic violence may be at risk due to similar misclassifications.

The case also raises concerns about over-reliance on algorithmic systems in law enforcement and victim protection, and highlights the need for better human oversight, improved algorithmic design, and more comprehensive risk assessment procedures.

System 🤖

Operator: Ministry of the Interior; Spanish National Police
Developer: Ministry of the Interior; SAS
Country: Spain
Sector: Govt - police
Purpose: Assess domestic violence risk
Technology: Risk assessment algorithm
Issue: Accountability; Accuracy/reliability; Safety; Transparency

Investigations, assessments, audits 👁️