King's Cross live facial recognition

Released: 2016
Occurred: August 2019


The August 2019 discovery by the Financial Times that the King's Cross development in London had been quietly using facial recognition to monitor tens of thousands of people moving daily within its site infuriated civil and privacy rights groups, which accused it of unethical behaviour. 

The fracas also put London's Metropolitan Police Service in the spotlight for its covert involvement in the private scheme, and for making a number of seemingly misleading and contradictory communications.

One month earlier, a UK parliamentary committee had called on police forces to stop using facial recognition technology until a legal framework governing its use was in place. In September 2019, King's Cross, which had been working on introducing a new facial recognition system, halted its use of the technology.

Inadequate transparency, accountability, consent

King's Cross developer Argent and its partners, including London's Metropolitan Police Service and British Transport Police, were roundly criticised for the fact that the system had operated between 2016 and 2018 without the knowledge of the local community, workers, and the general public, and without the permission or oversight of the Mayor of London.

The operators were also hauled over the coals for covertly sharing images with each other, for failing to gain the consent of those being monitored, and for refusing to reveal how long they had been using the technology, how user data was protected, whether and with whom data was being shared, or the legal basis for its use.

Misleading communications

Opaque and misleading communications about the use of facial recognition at King's Cross characterised the responses of those involved from the start.

When the news first broke, Argent said it had been using facial recognition to 'ensure public safety'. In a letter to London mayor Sadiq Khan dated August 14, Argent partner Robert Evans said the company wanted facial recognition software to spot people on the site who had previously committed an offence there.

The Met Police and British Transport Police had initially denied any involvement in the programme. But the Met Police later admitted it had supplied seven images for a database used to carry out facial recognition scans, despite having previously told the Mayor of London that it had not worked with 'any retailers or private sector organisations who use Live Facial Recognition'.

Operator: Argent; Metropolitan Police Service (MPS)
Developer: NEC

Country: UK

Sector: Govt - police

Purpose: Strengthen security

Technology: Facial recognition
Issue: Ethics; Governance; Privacy; Surveillance

Transparency: Governance; Marketing; Privacy

Page info
Type: Incident
Published: March 2023