Oregon drops 'unfair' child abuse Safety at Screening tool

Occurred: June 2022

An algorithm used by Oregon’s Department of Human Services to help decide which families are investigated for child abuse and neglect by child protective services was scrapped. 

The decision came after an AP review of Pennsylvania's Allegheny Family Screening Tool (AFST) found it had flagged a disproportionate number of Black children for 'mandatory' neglect investigations. The Allegheny tool had inspired Oregon officials to develop their own system, the Safety at Screening Tool (SST).

Oregon officials said use of the tool was halted to help reduce disparities concerning which families are investigated for child abuse and neglect by child protective services, and that it would be replaced by a new programme, the Structured Decision Making model.

A predictive risk tool developed and used by hotline workers at Oregon's Department of Human Services (DHS) from late 2018 to help decide which families flagged for instances of child abuse and neglect should be investigated by social workers, SST was also criticised for misidentifying individuals at risk and for poor transparency, creating the possibility of stigmatisation and inappropriate interventions and undermining trust in the state's child welfare system.

US Senator Ron Wyden had expressed concern about racial bias in the use of artificial intelligence tools in child protective services and welcomed this move by the Oregon Department of Human Services.

Operator: Oregon Department of Human Services (DHS)
Developer: Oregon Department of Human Services (DHS)

Country: USA

Sector: Govt - welfare 

Purpose: Predict child neglect/abuse

Technology: Prediction algorithm
Issue: Accuracy/reliability; Bias/discrimination - race, ethnicity

Transparency: Black box
