Danish child protection algorithm accused of age discrimination
Occurred: June 2024
A Danish government algorithmic system intended to help caseworkers identify children at risk of maltreatment was found to be flawed in multiple ways, including exhibiting age bias, calling its usefulness into question.
Developed for caseworkers at the Danish Child Protective Services, the Decision Support System (DSS) is intended to help social workers assess, in a timely manner, the growing number of notifications about children in immediate danger of maltreatment, and to provide consistent risk assessments for children referred to Child Protective Services.
However, IT University of Copenhagen researchers found that the algorithm "has significant methodological flaws, suffers from information leakage, relies on inappropriate proxy values for maltreatment assessment, generates inconsistent risk scores, and exhibits age-based discrimination."
Specifically, the researchers argued the system "scores otherwise identical cases completely differently just based on the age of the child. Age is a protected attribute and globally recognized as a ground for discrimination."
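The age-discrimination finding can be illustrated with a simple counterfactual test: score two cases that are identical in every feature except the child's age and compare the results. The sketch below is purely hypothetical — the scoring function, feature names, and weights are invented stand-ins, not the actual DSS model — but it shows the kind of behavior the researchers describe.

```python
def risk_score(case):
    """Hypothetical stand-in for a risk model; NOT the real DSS.
    Age directly shifts the score, mimicking the reported flaw."""
    base = 0.2 * case["prior_notifications"] + 0.3 * case["household_risk"]
    return base + 0.05 * case["age"]  # age alone moves the score

# Two otherwise identical cases differing only in age
case_a = {"age": 3, "prior_notifications": 2, "household_risk": 1.0}
case_b = {**case_a, "age": 15}

# A nonzero difference here means the score is sensitive to age,
# a protected attribute -- the discrimination the researchers flagged
delta = abs(risk_score(case_a) - risk_score(case_b))
print(f"Score difference from age alone: {delta:.2f}")
```

In an audit of a real system, this counterfactual comparison would be run against the deployed model's predictions rather than a toy formula; any systematic score difference attributable solely to age would indicate the discrimination described above.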
The researchers "strongly" advised against using algorithms of this kind in local government, municipal, and child protection settings, and called for rigorous evaluation of similar tools before they are implemented.
System 🤖
Decision Support System (DSS)
Operator: Danish Child Protective Services
Developer:
Country: Denmark
Sector: Govt - welfare
Purpose: Assess child abuse risk
Technology: Prediction algorithm; Machine learning
Issue: Accuracy/reliability; Bias/discrimination
Transparency: Governance