UK prisoner risk categorisation algorithm poses racism risk

Occurred: November 2019

An algorithmic system used by the UK government to categorise prisoners risks automating and embedding racism in the justice system, experts and advocacy groups warned.

Developed by Deloitte, the unnamed tool draws on data from sources including the prison service, the police and the National Crime Agency to assess what type of prison a person should be held in and how strictly they should be controlled during their sentence.

But critics argued the system could result in ethnic minority prisoners being unfairly placed in higher security conditions than white prisoners, exacerbating existing discrimination. Higher category prisoners also have fewer opportunities to develop skills and work towards rehabilitation than those held in open or lower security jails.

The Ministry of Justice (MoJ) refused to specify which intelligence systems the new tool uses, claiming that disclosing this information “would be likely to prejudice the maintenance of security and good order”.

Operator: Her Majesty's Prison and Probation Service; Ministry of Justice
Developer: Deloitte
Country: UK
Sector: Govt - justice
Purpose: Assess offender risk
Technology: Risk assessment algorithm
Issue: Bias/discrimination - race, ethnicity
Transparency: Governance; Black box