King's Cross quietly uses live facial recognition to monitor citizens

Occurred: August 2019

London's King's Cross development was discovered to be quietly using facial recognition to monitor tens of thousands of people moving daily within its site, prompting accusations of unethical behaviour from civil and privacy rights groups.

The fracas put London's Metropolitan Police Service in the spotlight for its covert involvement in the private scheme, and for issuing a number of seemingly misleading and contradictory statements.

One month earlier, the UK Parliament had told police forces to stop using facial recognition technology until a legal framework for its use was established.

In September 2019, King's Cross, which had been working on introducing a new facial recognition system, halted its use of the technology.

Inadequate transparency, accountability, consent

King's Cross developer Argent and its partners, including London's Metropolitan Police Service and British Transport Police, were criticised because the system had operated between 2016 and 2018 without the knowledge of the local community, workers, or the general public, and without the permission or oversight of the Mayor of London.

The operators were also hauled over the coals for covertly sharing images with each other, for failing to gain the consent of those being monitored, and for refusing to reveal how long they had been using the technology, how user data was protected, if and with whom data was being shared, or the legal basis for its use.

Misleading communications

Opaque and misleading communications about the use of facial recognition at King's Cross characterised the responses of those involved from the outset.

When news of the system's use first broke, Argent said it had been using facial recognition to 'ensure public safety'. In a letter to London mayor Sadiq Khan dated August 14, Argent partner Robert Evans said the developer wanted facial recognition software to spot people on the site who had previously committed an offence there.

The Met Police and British Transport Police had both initially denied any involvement in the programme. However, the Met Police later admitted it had supplied seven images for a database used to carry out facial recognition scans. It had previously told the Mayor of London that it had not worked with 'any retailers or private sector organisations who use Live Facial Recognition'.

System 🤖

Operator: Argent; Metropolitan Police Service (MPS)
Developer: NEC

Country: UK

Sector: Govt - police

Purpose: Strengthen security

Technology: Facial recognition
Issue: Ethics; Governance; Privacy; Surveillance

Transparency: Governance; Marketing; Privacy

Page info
Type: Issue
Published: March 2023