Met Police live facial recognition trials

Released: August 2016

London's Metropolitan Police Service conducted a series of trials of live facial recognition technology across London between August 2016 and February 2019. Trials included the Notting Hill Carnival, Remembrance Sunday, Stratford Westfield shopping centre, and Romford.

The programme drew sustained criticism from civil liberties and privacy advocates, as well as technology and legal experts, over the accuracy of the system and over inadequate transparency, accountability, and privacy protection.

False matches

A July 2019 University of Essex report (pdf), commissioned by the Met Police to review its use of facial recognition, found that only eight of 42 matches were verified as correct, meaning 81% of the people flagged by the system were innocent.

The Met Police responded by saying it was 'extremely disappointed with the negative and unbalanced tone of th[e] report'. As Sky News reported, the force 'prefers to measure accuracy by comparing successful and unsuccessful matches with the total number of faces processed by the facial recognition system. According to this metric, the error rate was 0.1%.' 
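The gap between the 81% and 0.1% figures comes down to the choice of denominator: the Essex researchers divided false matches by the number of alerts, while the Met divided them by all faces scanned. A minimal sketch of the arithmetic, using the report's figures of 42 alerts and eight verified matches; the total number of faces processed is a hypothetical value (not stated in the report) chosen so the Met's preferred metric lands near the reported 0.1%:

```python
alerts = 42                    # matches flagged by the system (Essex report)
correct = 8                    # matches verified as correct
false_alerts = alerts - correct
faces_processed = 34_000       # hypothetical total of faces scanned

# Essex researchers' metric: share of alerts that were wrong
false_match_rate = false_alerts / alerts
print(f"False matches among alerts: {false_match_rate:.0%}")  # 81%

# Met Police's metric: errors relative to all faces processed
error_rate = false_alerts / faces_processed
print(f"Errors over faces processed: {error_rate:.1%}")       # 0.1%
```

Both calculations use the same 34 false matches; only the baseline changes, which is why the two sides could cite such different accuracy figures from the same trials.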

Human rights law compliance

The University of Essex researchers questioned the legal basis on which the Met deployed facial recognition technology, finding it 'inadequate' in light of the police’s legal duties under human rights law. 

The report went on to suggest it was 'highly possible' that the Met Police's use of the system would be found unlawful if challenged in court.

Legal and ethical standards

A report (pdf) published in October 2022 by researchers at the University of Cambridge's Minderoo Centre for Technology and Democracy found that the Met Police's trials suffered from inadequate transparency and accountability, poor privacy protection, and failed to meet minimum ethical and legal standards.

The researchers went on to argue that live facial recognition technology should be banned from use in streets, airports, and all other public spaces in the UK.

Operator: Metropolitan Police Service (MPS)
Developer: NEC

Country: UK

Sector: Govt - police

Purpose: Strengthen security

Technology: Facial recognition
Issue: Accuracy/reliability; Surveillance; Privacy; Bias/discrimination - race, ethnicity, gender 

Transparency: Governance; Privacy; Marketing