Allegheny child neglect screening system unfairly flags Black kids for investigation
Occurred: April 2022
An algorithm used to assess whether children are at risk of abuse or neglect was found to disproportionately flag Black children for “mandatory” investigations compared with white children.
Research from Carnegie Mellon University revealed (pdf) that Allegheny County's Family Screening Tool recommended investigating two-thirds of Black children reported, compared with about half of all other children, raising concerns about its accuracy and potential for bias.
The researchers also found (pdf) that social workers disagreed with the risk scores the algorithm produced about one-third of the time.
Allegheny County officials dismissed the research as 'hypothetical' and based on outdated data, noting that social workers can always override the tool, which was never intended to be used on its own.
The findings prompted questions about whether the tool improved decision-making or reduced racial disparities in child welfare investigations.
The findings also surfaced frustration about the secrecy surrounding the algorithm and how its risk scores are calculated; transparency is considered especially important given the sensitivity of the tool and its impact on the lives of children and their families.
➕ June 2022. Oregon dropped its Safety at Screening Tool, which was modelled on Allegheny County's Family Screening Tool.
➕ January 2023. The AP reported that the Allegheny algorithm could harden bias against people with disabilities, including families with mental health issues.
Operator: Allegheny County Children and Youth Services
Developer: Rhema Vaithianathan; Emily Putnam-Hornstein; Irene de Haan; Marianne Bitler; Tim Maloney; Nan Jiang
Country: USA
Sector: Govt - welfare
Purpose: Predict child neglect/abuse
Technology: Prediction algorithm
Issue: Accuracy/reliability; Bias/discrimination - race, ethnicity
Stapleton L., Cheng H-F., Kawakami A., Sivaraman V., Cheng Y., Qing D., Perer A., Holstein K., Wu Z.S., Zhu H. (2022). Extended Analysis of “How Child Welfare Workers Reduce Racial Disparities in Algorithmic Decisions” (pdf)
AP (2023). Not magic: Opaque AI tool may flag parents with disabilities
AP (2022). Oregon dropping AI tool used in child abuse cases
AP (2022). An algorithm that screens for child neglect raises concerns
https://www.nytimes.com/2018/01/02/magazine/can-an-algorithm-tell-when-kids-are-in-danger.html
https://medicalxpress.com/news/2022-04-algorithm-screens-child-neglect.html
https://pulitzercenter.org/stories/algorithm-screens-child-neglect-raises-concerns
https://www.wired.com/story/excerpt-from-automating-inequality/
https://virginia-eubanks.com/2018/02/16/a-response-to-allegheny-county-dhs/
https://www.muckrock.com/news/archives/2019/jul/10/algorithms-family-screening-Pennsylvania/
Page info
Type: Issue
Published: July 2024