PredPol perpetuates racial, ethnic, income bias

Occurred: December 2021

An investigation by The Markup and Gizmodo found evidence of racial, ethnic and income bias in the crime prediction system PredPol.

Analysis of millions of crime predictions left on an unsecured server showed that PredPol (since renamed Geolitica) often avoided White and middle-to-upper-income neighbourhoods while targeting Black and Latino neighbourhoods.

The findings suggest PredPol's technology creates so-called 'feedback loops': lower-income, predominantly minority neighbourhoods are flagged as surveillance hotspots, drawing more patrols and producing disproportionately higher numbers of arrests among minority residents, which in turn reinforce the system's future predictions.
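The feedback-loop mechanism itself is straightforward to demonstrate. The Python sketch below is not Geolitica's proprietary algorithm; it is a minimal toy model, with invented rates and seed counts, of any system that sends patrols wherever recorded crime is highest and only records crime where patrols go.

```python
import random

random.seed(0)

TRUE_RATE = 0.3       # identical daily crime probability in both areas
recorded = [30, 10]   # historically biased record seeds the "predictions"

for day in range(1000):
    # "Prediction": patrol the area with the higher recorded-crime count.
    patrolled = 0 if recorded[0] >= recorded[1] else 1
    for area in (0, 1):
        # Crime occurs at the same rate everywhere, but is only
        # observed and recorded where officers are sent to look.
        if area == patrolled and random.random() < TRUE_RATE:
            recorded[area] += 1

print(recorded)  # area 0's record grows by ~300; area 1's never moves
```

Even though both areas have the same true crime rate, the area with the larger historical record attracts every patrol, so its record keeps growing while the other's stays frozen: the disparity is manufactured by the allocation rule, not by underlying crime.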

Operator: Los Angeles Police Department

Developer: Geolitica/PredPol

Country: USA

Sector: Govt - police

Purpose: Predict crime

Technology: Behavioural analysis

Issue: Accuracy/reliability; Bias/discrimination - race, ethnicity, income

Transparency: Governance; Black box

Page info
Type: Incident
Published: September 2023
Last updated: November 2023