PredPol perpetuates racial, ethnic, income bias
Occurred: December 2021
An investigation by The Markup and Gizmodo discovered potential bias issues with the crime prediction system PredPol.
Analysis of a huge volume of crime predictions left on an unsecured server showed that PredPol (now renamed Geolitica) often avoided White and middle-to-upper-income neighbourhoods while disproportionately targeting Black and Latino neighbourhoods.
The findings suggest PredPol's technology creates so-called 'feedback loops' in which lower-income, minority neighbourhoods are treated as surveillance hotspots, leading to disproportionately higher numbers of arrests among minority populations.
Published: September 2023
Last updated: November 2023