South Wales Police facial recognition

Released: May 2017


A trial of facial recognition technology by South Wales Police (SWP) at football and rugby matches, music festivals, and on city streets between May 2017 and April 2019 drew significant criticism from civil liberties and privacy advocates.

South Wales Police's trial consisted of mobile video cameras connected to facial recognition software that scanned crowds for faces on a watchlist. Its use of the technology was ruled unlawful by the UK Court of Appeal in August 2020.
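The NEC system SWP deployed is proprietary, but live facial recognition of this kind broadly works by converting each face detected in a camera frame into a numerical embedding and comparing it against embeddings of the watchlist images, raising an alert for an operator whenever a similarity threshold is exceeded. The following is a minimal, hypothetical sketch of that matching step in Python; the embedding size, threshold, identifiers, and random vectors are illustrative assumptions, not details of SWP's or NEC's actual implementation.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(probe: np.ndarray,
                            watchlist: dict[str, np.ndarray],
                            threshold: float = 0.6) -> list[tuple[str, float]]:
    """Return (person_id, score) for watchlist entries scoring above the threshold.

    In a live deployment, `probe` would be the embedding of a face detected in a
    camera frame and `watchlist` would hold embeddings of custody images; every
    score above the threshold would generate an alert for a human operator to
    review. All values here are illustrative, not the deployed system's.
    """
    hits = []
    for person_id, ref_embedding in watchlist.items():
        score = cosine_similarity(probe, ref_embedding)
        if score >= threshold:
            hits.append((person_id, score))
    return sorted(hits, key=lambda h: h[1], reverse=True)

# Illustrative usage with random vectors standing in for real embeddings.
rng = np.random.default_rng(42)
watchlist = {f"custody_{i:04d}": rng.normal(size=128) for i in range(500)}
probe_face = rng.normal(size=128)
print(match_against_watchlist(probe_face, watchlist, threshold=0.3))
```

The choice of threshold trades missed matches against false alerts, which is the crux of the accuracy criticism described below.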

False matches

The system was accused of producing unacceptably high error rates. More than 2,000 people were wrongly identified as possible criminals by South Wales Police's facial recognition system during the 2017 Champions League final in Cardiff: of the 2,470 potential matches against custody pictures, 2,297 (93%) were wrong.

The force attributed the high volume of false matches to 'poor quality images' supplied by agencies including UEFA and Interpol, and to the fact that this was its first major use of the equipment. A later evaluation of the system by Cardiff University found that it flagged 2,900 possible suspects, of which 2,755 were false matches.
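As a quick arithmetic check, the false-match rates implied by the counts cited above can be computed directly (a minimal sketch using only the figures already reported; the deployment labels are shorthand):

```python
# False-match rates implied by the figures reported above.
deployments = {
    "2017 Champions League final (Cardiff)": (2_470, 2_297),  # (alerts, false matches)
    "Cardiff University evaluation": (2_900, 2_755),
}

for name, (alerts, false_matches) in deployments.items():
    rate = false_matches / alerts
    print(f"{name}: {false_matches:,} of {alerts:,} alerts were false ({rate:.0%})")
```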


Ed Bridges' legal challenge

Campaign group Liberty brought a legal case against South Wales Police after Cardiff resident and civil rights activist Ed Bridges claimed the force had breached his privacy and data protection rights by capturing and processing his facial features while he was shopping and when he was attending a defence industry exhibition.


Bridges lost his initial challenge, with two judges ruling that the force's use of the technology was lawful. However, he won on appeal, with the Court of Appeal finding that there had been inadequate guidance on where the system could be used and who could be put on a watchlist, that the data protection impact assessment was deficient, and that the force had failed to take reasonable steps to determine whether the software contained racial or gender bias.


Legal and ethical standards

An October 2022 report by researchers at the University of Cambridge's Minderoo Centre for Technology and Democracy found that the South Wales Police trial had failed to meet minimum expected ethical and legal standards.

The report singled out the force's failure to establish limits on the use of facial recognition technology at protests and assemblies; inadequate oversight; concerns about the independence of the Joint Independent Ethics Committee and the absence of human rights, equality, or data protection experts on that committee; and the lack of public consultation, notably with marginalised communities.

Operator: South Wales Police
Developer: NEC

Country: UK

Sector: Govt - police

Purpose: Strengthen security

Technology: Facial recognition
Issue: Accuracy/reliability; Bias/discrimination - race, ethnicity, gender; Ethics; Freedom of expression - right of assembly; Privacy

Transparency: Governance; Privacy


Page info
Type: System
Published: March 2023
Last updated: November 2023