Malaysia AI court sentencing system said to be inaccurate, unfair

Occurred: April 2022


An AI-powered court sentencing system being tested in Malaysia was accused by lawyers of being inaccurate and of producing biased, unfair sentences.

Developed by Sarawak Information Systems, the system was trialled in a nationwide pilot intended to make sentencing more consistent and to help clear the backlog of cases clogging Malaysia's legal system, according to Reuters.

However, senior lawyers said the 'opaque' system lacks a judge's ability to weigh up individual circumstances or adapt to changing social mores, and argued it should be withdrawn.

Furthermore, lawyers said there had been no proper consultation on the technology's use. Malaysia's Bar Council said it was 'not given guidelines at all' when courts in Kuala Lumpur started using the system in mid-2021 for sentencing in 20 types of crimes.

Operator: Mahkamah Persekutuan Malaysia
Developer: Sarawak Information Systems (SAINS)
Country: Malaysia
Sector: Govt - justice
Purpose: Achieve greater sentencing consistency
Technology: Predictive statistical analysis
Issue: Accuracy/reliability; Fairness; Bias/discrimination - race, ethnicity
Transparency: Governance; Black box; Marketing

Page info
Type: Incident
Published: April 2022