UK prisoner risk categorisation algorithm poses racism risk
Occurred: November 2019
A UK government algorithmic system for categorising prisoners in UK jails risks automating and embedding racism in the prison system, experts and advocacy groups warned.
Developed by Deloitte, the unnamed tool draws on data from sources including the prison service, the police and the National Crime Agency to assess what type of prison a person should be put in and how strictly they should be controlled during their sentence.
But critics argued the system could result in ethnic minority prisoners being unfairly placed in higher security conditions than white prisoners, exacerbating existing discrimination. Higher category prisoners have fewer opportunities to develop skills and work towards rehabilitation than those held in open or lower security jails.
The Ministry of Justice (MoJ) refused to specify which intelligence systems the new tool used, claiming this information “would be likely to prejudice the maintenance of security and good order”.
Databank
Operator: Her Majesty's Prison and Probation Service; Ministry of Justice
Developer: Deloitte
Country: UK
Sector: Govt - justice
Purpose: Assess offender risk
Technology: Risk assessment algorithm
Issue: Bias/discrimination - race, ethnicity
Transparency: Governance; Black box
System
Research, advocacy
Zilka M., Sargeant H., Weller A. (2022). Transparency, Governance and Regulation of Algorithmic Tools Deployed in the Criminal Justice System: a UK Case Study
The Law Society (2019). Mapping algorithms in the justice system
The Police Foundation (2010). Intelligent Justice? (pdf)
Ministry of Justice (2007). Predicting and understanding risk of re-offending: the Prisoner Cohort Study (pdf)
Page info
Type: Issue
Published: February 2024