Queensland domestic violence predictive policing trial prompts concerns

Released: TBC

A trial of predictive policing software in Queensland, Australia, prompted concerns that minority groups would be unfairly targeted.

The Queensland Police Service (QPS) initiated a trial using artificial intelligence (AI) to predict the future risk posed by known domestic violence perpetrators. The AI system identified "high risk" perpetrators based on previous calls to an address, past criminal activity, and other police-held data. These individuals would then be visited at home by police before domestic violence escalated, and before any crime had been committed.
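To make the mechanism concrete, the sketch below shows how a risk-scoring model of this general kind is commonly assembled. The feature names, toy data, classifier choice, and the 0.8 flagging threshold are all illustrative assumptions; QPS has not published its model's details.

```python
# Hypothetical sketch of a perpetrator risk-scoring pipeline.
# Feature names, toy data, the classifier, and the 0.8 threshold
# are illustrative assumptions, not details disclosed by QPS.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

records = pd.DataFrame({
    "prior_callouts": [0, 3, 7, 1, 12, 2],  # previous calls to the address
    "prior_offences": [1, 2, 5, 0, 9, 1],   # past criminal activity
    "order_breaches": [0, 1, 1, 0, 3, 0],   # other police-held data
    "escalated":      [0, 0, 1, 0, 1, 0],   # label: later serious incident
})

X = records.drop(columns="escalated")
y = records["escalated"]

model = GradientBoostingClassifier(random_state=0).fit(X, y)

# Everyone above an arbitrary threshold is flagged "high risk"
# and would be scheduled for a pre-emptive home visit.
records["risk"] = model.predict_proba(X)[:, 1]
print(records[records["risk"] > 0.8])
```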

However, the approach sparked controversy over its potential for unintended consequences. Critics argued that the AI system could entrench existing biases in the criminal justice system by creating a feedback loop: the people who already have the most contact with police generate the most police data, and are therefore the most likely to be flagged for further contact.
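The feedback-loop critique can be illustrated with a toy simulation. All rates and the update rule below are invented for illustration: two people behave identically, but the one who starts with more recorded police contact is flagged, watched more closely, and so accumulates records faster.

```python
# Toy simulation of the feedback loop critics describe: the person
# with more recorded contact gets flagged, is watched more closely,
# and so accumulates records faster. All numbers are invented.
import random

random.seed(0)

recorded = {"A": 2, "B": 6}   # same behaviour, different starting records
TRUE_RATE = 0.5               # identical real-world incident rate
DETECT_FLAGGED = 0.9          # incidents recorded when under close watch
DETECT_UNFLAGGED = 0.3        # incidents recorded otherwise

for cycle in range(50):
    flagged = max(recorded, key=recorded.get)   # stand-in for the model
    for person in recorded:
        if random.random() < TRUE_RATE:         # an incident occurs
            p = DETECT_FLAGGED if person == flagged else DETECT_UNFLAGGED
            if random.random() < p:
                recorded[person] += 1           # contact creates more data

print(recorded)   # B's record grows far faster despite identical behaviour
```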

QPS claimed it removed ethnicity and geographic location attributes from the data before training the AI model. However, human and civil rights advocates expressed concerns that Aboriginal and Torres Strait Islander people would still be unfairly targeted, as they already are in practice.
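Removing protected attributes before training, sometimes called "fairness through unawareness", does not necessarily remove the information those attributes carry, because other fields can act as proxies for them. A minimal sketch of the problem, using an invented dataset:

```python
# Sketch of why dropping ethnicity and location fields may not
# remove bias: remaining features can act as proxies. The data
# below is invented purely to illustrate the point.
import pandas as pd

df = pd.DataFrame({
    "ethnicity":      ["X", "X", "X", "Y", "Y", "Y"],
    "prior_callouts": [9, 7, 8, 1, 2, 1],  # higher where over-policing occurs
    "prior_charges":  [4, 3, 5, 0, 1, 0],
})

# "Fairness through unawareness": drop the protected attribute.
X = df.drop(columns="ethnicity")

# The remaining columns still separate the two groups almost
# perfectly, so a model trained on X can reconstruct ethnicity.
print(X.groupby(df["ethnicity"]).mean())
```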

This initiative raised questions about the role of police in preventing domestic violence incidents and the ethical implications of using AI in this context. While the aim of AI-based policing strategies is to prevent or reduce crime through an assessment of the risk of future offending, there are concerns that this approach may inadvertently create crime.

System 🤖


Operator: Queensland Police Service (QPS)
Developer: Queensland Police Service (QPS)
Country: Australia
Sector: Govt - police
Purpose: Identify high-risk domestic violence offenders
Technology: Prediction algorithm; Risk assessment
Issue: Bias/discrimination - race, ethnicity; Ethics
Transparency: Black box

Research, advocacy 🧮

Page info
Type: Issue
Published: February 2022
Last updated: June 2024