PredPol perpetuates racial, ethnic, and income bias
An investigation by The Markup and Gizmodo has uncovered potential bias issues with the crime prediction system PredPol.
Analysis of a huge volume of crime predictions left on an unsecured server showed that PredPol (now renamed Geolitica) often avoided neighbourhoods with predominantly White, middle-to-upper-income residents while targeting Black and Latino neighbourhoods.
The findings suggest PredPol's technology creates so-called 'feedback loops', in which lower-income, minority neighbourhoods are treated as surveillance hotspots, leading to disproportionately higher numbers of arrests among minority populations.
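The feedback-loop dynamic can be illustrated with a minimal, hypothetical simulation (not based on PredPol's actual algorithm or data): two neighbourhoods have the same underlying crime rate, but patrols are allocated in proportion to past recorded incidents, and incidents are only recorded where patrols are present. The initial disparity in records then sustains itself indefinitely.

```python
import random

random.seed(0)

# Hypothetical illustration only. Both neighbourhoods have the SAME
# true crime rate; they differ only in how many incidents happen to be
# on record at the start.
TRUE_CRIME_RATE = 0.3          # identical in both neighbourhoods
recorded = {"A": 10, "B": 5}   # A starts with more recorded incidents
PATROLS_PER_DAY = 20

for day in range(200):
    total = recorded["A"] + recorded["B"]
    for hood in recorded:
        # Patrols follow the "prediction" (here: each area's share of
        # past recorded incidents).
        patrols = round(PATROLS_PER_DAY * recorded[hood] / total)
        # Crime is only recorded where a patrol is present to observe it.
        for _ in range(patrols):
            if random.random() < TRUE_CRIME_RATE:
                recorded[hood] += 1

share_a = recorded["A"] / (recorded["A"] + recorded["B"])
print(f"share of recorded incidents in A: {share_a:.2f}")
```

Despite identical true crime rates, neighbourhood A's share of recorded incidents stays well above an even split, because the extra patrols it attracts generate the extra records that justify those patrols.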
System
Investigations, assessments, audits
The Markup (2021). Crime Prediction Software Promised to Be Free of Biases. New Data Shows It Perpetuates Them
The Markup (2021). How We Determined Crime Prediction Software Disproportionately Targeted Low-Income, Black, and Latino Neighborhoods
The Markup (2021). PredPol investigation data
News, commentary, analysis
Page info
Type: Incident
Published: September 2023
Last updated: November 2023