Occurred: 2013
The sentencing of Paul Zilly sparked controversy over algorithmic fairness in criminal justice when a judge doubled his prison time based on an algorithmic risk assessment tool's prediction.
Paul Zilly was convicted of stealing a lawnmower and tools in Wisconsin, USA.
Despite a plea deal recommending one year in county jail, Judge James Babler rejected it after viewing Zilly's COMPAS risk scores, which rated him as high risk for future violent crime and medium risk for general recidivism.
The judge increased Zilly's sentence to two years in state prison, describing his risk assessment as "about as bad as it could be."
The incident occurred due to:
Overreliance on AI. Judge Babler heavily weighted the COMPAS algorithm's prediction, despite its intended use for correctional authorities rather than sentencing.
Algorithmic bias. COMPAS and similar tools have been found to exhibit racial bias: ProPublica's 2016 analysis found that Black defendants who did not reoffend were nearly twice as likely as white defendants to be wrongly flagged as high risk (see the sketch after this list).
Lack of transparency. The proprietary nature of COMPAS prevented scrutiny of its algorithms, making it impossible to verify the integrity of its risk scores.
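ProPublica's "Machine Bias" investigation (referenced below) quantified the disparity by comparing false positive rates, i.e. the share of defendants who did not reoffend but were still flagged high risk, across racial groups. A minimal sketch of that comparison in Python, using hypothetical field names and toy records rather than the real COMPAS data:

def false_positive_rate(records, group):
    # FPR: among defendants in `group` who did NOT reoffend,
    # the fraction that was nevertheless flagged high risk.
    flagged = total = 0
    for r in records:
        if r["race"] == group and not r["reoffended"]:
            total += 1
            if r["high_risk"]:
                flagged += 1
    return flagged / total if total else 0.0

# Toy records for illustration only (not COMPAS data).
records = [
    {"race": "African-American", "high_risk": True,  "reoffended": False},
    {"race": "African-American", "high_risk": False, "reoffended": False},
    {"race": "Caucasian",        "high_risk": False, "reoffended": False},
    {"race": "Caucasian",        "high_risk": True,  "reoffended": True},
]

for g in ("African-American", "Caucasian"):
    print(g, round(false_positive_rate(records, g), 2))

On the real data, ProPublica reported a false positive rate for Black defendants roughly twice that for white defendants, the disparity this kind of check surfaces.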
The controversy highlighted the potential for algorithmic systems to exacerbate existing biases in the criminal justice system, and underscored the need for greater transparency, accountability, and fairness when such systems inform judicial decisions.
It also prompted discussion about the appropriate role of algorithms in legal proceedings and the importance of human judgement in sentencing decisions.
Equivant
Operator: Wisconsin Court System
Developer: Volaris Group/Equivant/Northpointe
Country: USA
Sector: Govt - justice
Purpose: Assess recidivism risk
Technology: Recidivism risk assessment system
Issue: Accountability; Accuracy/reliability; Bias/discrimination; Fairness; Transparency
Rowe E.A., Prior N. (2022). Procuring Algorithmic Transparency (pdf)
Collins E. (2018). Punishing Risk (pdf)
Carlson A.M. (2017). The Need for Transparency in the Age of Predictive Sentencing Algorithms (pdf)
ProPublica (2016). Machine Bias
https://www.sciencefocus.com/future-technology/can-an-algorithm-deliver-justice/
http://archive.jsonline.com/news/crime/risk-scores-attached-to-defendants-unreliable-racially-biased-b99732973z1-381306991.html
https://www.theatlantic.com/ideas/archive/2019/06/should-we-be-afraid-of-ai-in-the-criminal-justice-system/592084/
https://boingboing.net/2016/05/24/algorithmic-risk-assessment-h.html
https://theamericanscholar.org/the-future-of-silicon-valley/
https://digital.hbs.edu/platform-rctom/submission/man-or-machine-who-holds-the-stronger-suit-in-the-courtroom/
https://ilr.law.uiowa.edu/print/volume-103-issue-1/the-need-for-transparency-in-the-age-of-predictive-sentencing-algorithms/
https://www.cdotrends.com/story/17205/pleading-your-case-ai-judge
https://www.criminallegalnews.org/news/2020/jan/21/compute-or-not-compute-algorithm-driven-ai-criminal-justice-system/
https://urbanmilwaukee.com/2017/01/05/murphys-law-justice-system-uses-biased-test/
Page info
Type: Incident
Published: March 2023
Last updated: February 2025