Durham police rapped for 'crude' criminal reoffender profiling

Occurred: April 2018


Durham's police force was criticised by privacy campaigners over the 'crude' nature of the data it was using to help predict which offenders were likely to commit more crimes.

An investigation by digital rights and privacy group Big Brother Watch (BBW) revealed that Durham Constabulary had augmented the police data underpinning its Harm Assessment Risk Tool (HART) with Experian's Mosaic dataset. Mosaic classified Britons into 66 groups, such as 'disconnected youth' and 'Asian heritage', annotated with lifestyle details such as 'heavy TV viewers', 'overcrowded flats' and 'families with needs'.

Such categories were 'really quite offensive and crude', according to BBW's Silkie Carlo. Durham Constabulary later said it had stopped including Mosaic data in its dataset.

The Harm Assessment Risk Tool (HART) scored offenders and placed them into one of three categories indicating that they were at low, moderate or high risk of reoffending. Those deemed to be at moderate risk of reoffending were offered the chance to enter a rehabilitation programme called Checkpoint as an 'alternative to prosecution'.
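
For illustration only, the sketch below shows how a three-band risk classifier of this kind might map a model's reoffending score to the low/moderate/high categories described above, with the moderate band routed to a Checkpoint-style diversion. This is not Durham Constabulary's actual HART implementation; the thresholds, score range and function names are hypothetical assumptions.

```python
# Illustrative sketch only -- not Durham Constabulary's HART code.
# Thresholds, score range and names are hypothetical assumptions.

def risk_band(score: float) -> str:
    """Map a model's reoffending score (assumed 0.0-1.0) to a risk band."""
    if score < 0.33:        # hypothetical low/moderate cut-off
        return "low"
    if score < 0.66:        # hypothetical moderate/high cut-off
        return "moderate"
    return "high"

def eligible_for_checkpoint(score: float) -> bool:
    """Per the reporting, only moderate-risk offenders were offered Checkpoint."""
    return risk_band(score) == "moderate"

if __name__ == "__main__":
    for s in (0.2, 0.5, 0.9):
        print(f"score={s}: band={risk_band(s)}, checkpoint={eligible_for_checkpoint(s)}")
```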

Databank

Operator: Durham Constabulary
Developer: Cambridge University; Durham Constabulary
Country: UK
Sector: Govt - police
Purpose: Predict criminal reoffenders
Technology: Prediction algorithm; Machine learning
Issue: Accuracy/reliability; Bias/discrimination; Human/civil rights
Transparency: Governance; Black box

Page info
Type: Issue
Published: February 2024