Gladsaxe vulnerable children detection
Gladsaxe was an algorithmic system used by Denmark's Gladsaxe municipality to identify and assess children at risk of abuse.
Released in 2018, the so-called 'Gladsaxe Model' consisted of custom algorithms drawing on parental health records, unemployment status, missed medical and dental appointments, and other data provided locally and by Denmark's Udbetaling Danmark benefits agency to produce a points-based risk assessment.
Mental illness counted for 3000 points, unemployment 500 points, missing a doctor’s appointment 1000 points and a dentist’s appointment 300 points.
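The reported point weights suggest a simple additive scoring scheme. The sketch below illustrates how such a score might be computed; the weights are those cited in reporting, but the municipality's actual aggregation logic, indicator names, and any risk threshold were never published, so everything beyond the four point values is hypothetical.

```python
# Reported point weights from the Gladsaxe Model (per press coverage).
# Indicator names are illustrative; the real system's variables are unknown.
RISK_POINTS = {
    "parental_mental_illness": 3000,
    "missed_doctor_appointment": 1000,
    "parental_unemployment": 500,
    "missed_dentist_appointment": 300,
}

def risk_score(indicators):
    """Sum the reported point values for each indicator flagged True.

    `indicators` maps indicator names to booleans; unknown keys are ignored.
    """
    return sum(points for key, points in RISK_POINTS.items()
               if indicators.get(key))

# Example: parental mental illness plus one missed dentist appointment
score = risk_score({"parental_mental_illness": True,
                    "missed_dentist_appointment": True})
print(score)  # 3300
```

Even this toy version shows why critics called the model crude: a single binary flag (a parent's mental-health record) outweighs several missed appointments combined.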
System 🤖
Gladsaxe Model algorithm
Operator: Gladsaxe Municipality
Developer: Gladsaxe Municipality; Udbetaling Danmark
Country: Denmark
Sector: Govt - municipal
Purpose: Detect vulnerable children
Technology: Risk assessment algorithm
Issue: Accuracy/reliability; Privacy; Scope creep/normalisation
Transparency: Governance; Black box; Marketing
Risks and harms 🛑
Incidents and issues 🔥
Public criticism of the system's intrusiveness, scope creep, inaccuracy and unreliability forced Gladsaxe municipality to delay the roll-out of the system.
However, the authorities continued to develop and expand it with additional data, including household electricity use, until criticism from Denmark's data protection agency and a deepening political backlash led to the system's demise in 2019.
The system raised broader concerns about the role of algorithms in democratic societies and the need for proper oversight and regulation of automated decision-making in public services.
Research, advocacy 🧮
News, commentary, analysis 🗞️
https://politiken.dk/viden/Tech/art7202917/Algoritmer-skal-udpege-langtidsledige
https://eticasfoundation.org/ghetto-plan-in-denmark-tracing-children-with-special-needs/
https://automatingsociety.algorithmwatch.org/report2020/denmark/
https://damijan.org/2018/12/26/umetna-inteligenca-in-spodkopavanje-telmeljev-demokracije/
Page info
Type: Incident
Published: December 2018
Last updated: June 2024