Eric Loomis algorithmic risk assessment accused of denying due process

Occurred: 2016

The use of a controversial algorithmic recidivism tool in sentencing Eric Loomis to six years in prison for driving a car used in a shooting was criticised and appealed as a denial of due process.

Eric Loomis was sentenced to six years in prison for driving a car used in a shooting, with the judge relying in part on the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) tool.

The ruling led Loomis to lodge a legal appeal on the basis that the risk score assigned to him could not be scrutinised due to its protection as a trade secret, and that the use of COMPAS infringed both his right to an individualised sentence and his right to be sentenced on the basis of accurate information.

He also argued that the court unconstitutionally considered gender at sentencing by relying on a risk assessment that took gender into account.

COMPAS risk assessments are based on data gathered from a defendant's criminal file and from an interview with the defendant, and predict the risk of pretrial recidivism, general recidivism, and violent recidivism.

A registered sex offender, Loomis scored high on all three risk measures and was sentenced to six years in prison. The judge said he had arrived at his sentencing decision in part because of Loomis's rating on the COMPAS assessment. Loomis then filed a motion requesting a new sentencing hearing, which was denied at all three levels of the Wisconsin court system.

Nonetheless, the court advised that judges presented with COMPAS risk scores must be made aware of the tool's limitations, including the proprietary nature of its methodology.

The court also ruled that the use of COMPAS did not deny due process, as the court ultimately imposes a sentence on the basis of all of its knowledge of the defendant, including their criminal history.

The case raised questions about the opacity, accountability, fairness and efficacy of the tool and of similar tools, such as the Ontario Domestic Assault Risk Assessment (ODARA) and the Virginia Non-violent Risk Assessment (NVRA).

It also prompted debate about the use of big data and algorithms to predict crime (aka 'predictive policing').

Operator: Wisconsin Court System
Developer: Volaris Group/Equivant/Northpointe

Country: USA

Sector: Govt - justice

Purpose: Assess recidivism risk

Technology: Recidivism risk assessment system
Issue: Accountability; Bias/discrimination - race, ethnicity; Human/civil rights

Transparency: Governance; Black box


Page info
Type: Incident
Published: March 2023
Last updated: March 2024