Gladsaxe vulnerable children detection
Gladsaxe was a predictive analytics system used by Denmark's Gladsaxe municipality to identify and assess children at risk of abuse.
Released in 2018, the so-called "Gladsaxe model" or "EVA" (Early Vulnerability Assessment) consisted of custom algorithms drawing on parental health records, unemployment data, missed medical and dental appointments, and other data provided locally and by Denmark's Udbetaling Danmark benefits agency to produce a points-based risk assessment.
A parent's mental illness counted for 3,000 points, unemployment 500 points, a missed doctor's appointment 1,000 points, and a missed dentist's appointment 300 points.
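The reported weights suggest a simple additive scheme, though the municipality never disclosed how points were aggregated or what threshold triggered a flag. The sketch below is illustrative only: the additive scoring and the indicator names are assumptions, with only the point values taken from public reporting.

```python
# Illustrative sketch of the reported points-based scoring.
# Point values come from public reporting on the Gladsaxe system;
# the indicator names and the additive aggregation are assumptions.
REPORTED_WEIGHTS = {
    "parental_mental_illness": 3000,
    "missed_doctor_appointment": 1000,
    "parental_unemployment": 500,
    "missed_dentist_appointment": 300,
}

def risk_score(indicators: dict) -> int:
    """Sum reported weights over observed indicator counts (assumed additive)."""
    return sum(REPORTED_WEIGHTS[name] * count
               for name, count in indicators.items()
               if name in REPORTED_WEIGHTS)

# Example: an unemployed parent and two missed dentist appointments.
print(risk_score({"parental_unemployment": 1,
                  "missed_dentist_appointment": 2}))  # 1100
```

Even this minimal reconstruction shows why critics objected: a single data point such as a parent's mental health diagnosis outweighs every other indicator combined.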
System 🤖
Website:
Operator: Gladsaxe Municipality
Developer: Gladsaxe Municipality; Udbetaling Danmark
Country: Denmark
Sector: Govt - municipal
Purpose: Detect vulnerable children
Type: Risk assessment algorithm
Technology:
Documents 📃
Transparency and accountability 🙈
The Gladsaxe system raised concerns regarding transparency and accountability.
Decision-making process. The system's algorithms and decision-making processes were not fully transparent, making it difficult to understand how risk scores were calculated and which factors contributed to a child being flagged as vulnerable.
Black box. The Gladsaxe system was proprietary and its inner workings not publicly accessible - a lack of transparency that hindered external scrutiny and made it challenging to identify potential biases or errors.
User communication. The system's outputs, such as risk scores and alerts, were not always shared with parents or guardians, which could lead to a lack of understanding and potential mistrust.
Inadequate oversight. There was limited external oversight and monitoring of the system's performance, leading to a lack of accountability for errors or biases.
Complaints and appeals. There were limited opportunities for individuals to appeal or contest the system's decisions.
Risks and harms 🛑
The Gladsaxe model algorithm was seen to have posed risks of privacy invasion, misidentification and potential bias, leading to unwarranted interventions and stigmatisation.
Incidents and issues 🔥
Public criticism of the system's intrusiveness, scope creep, inaccuracy and unreliability forced Gladsaxe municipality to delay the roll-out of the system.
However, the authorities continued to develop and expand it with additional data, including household electricity use, until criticism from Denmark's data protection agency and a deepening political backlash led to the system's demise in 2019.
The system raised concerns about the role of algorithms in democratic societies, and the need for proper oversight and regulation.
Research, advocacy 🧮
News, commentary, analysis 🗞️
https://politiken.dk/viden/Tech/art7202917/Algoritmer-skal-udpege-langtidsledige
https://eticasfoundation.org/ghetto-plan-in-denmark-tracing-children-with-special-needs/
https://automatingsociety.algorithmwatch.org/report2020/denmark/
https://damijan.org/2018/12/26/umetna-inteligenca-in-spodkopavanje-telmeljev-demokracije/
Page info
Type: Incident
Published: December 2018
Last updated: June 2024