Durham police rapped for 'crude' criminal reoffender profiling
Occurred: April 2018
Durham's police force was criticised by privacy campaigners over the 'crude' nature of the data it was using to help predict which offenders were likely to commit more crimes.
An investigation by digital rights and privacy group Big Brother Watch (BBW) revealed that Durham Constabulary had augmented police data underpinning its Harm Assessment Risk Tool (HART) with Experian's Mosaic dataset.
The dataset classified Britons into 66 groups, such as 'disconnected youth' and 'Asian heritage', which were annotated with lifestyle details such as 'heavy TV viewers', 'overcrowded flats' and 'families with needs'.
Such categories were 'really quite offensive and crude', according to BBW's Silkie Carlo. Durham Constabulary later said it had stopped including Mosaic in its dataset.
The Harm Assessment Risk Tool scored offenders and placed them into three categories indicating they were at low, moderate or high risk of reoffending. Those deemed to be at moderate risk of reoffending were offered the chance to go into a rehabilitation programme called Checkpoint as an 'alternative to prosecution'.
System 🤖
Harm Assessment Risk Tool (HART)
Operator: Durham Constabulary
Developer: Cambridge University; Durham Constabulary
Country: UK
Sector: Govt - police
Purpose: Predict criminal reoffenders
Technology: Prediction algorithm; Machine learning
Issue: Accuracy/reliability; Bias/discrimination; Human/civil rights; Privacy
Research, advocacy 🧮
EDRi (2020). Use cases: Impermissible AI and fundamental rights breaches (pdf)
Big Brother Watch (2018). Police use Experian marketing data for AI custody decisions
Big Brother Watch (2018). A closer look at Experian big data and artificial intelligence in Durham Police
Oswald M., Grace J. (2017). Algorithmic Risk Assessment Policing Models: Lessons from the Durham HART Model and ‘Experimental’ Proportionality
Page info
Type: Issue
Published: February 2024