King's Cross quietly uses live facial recognition to monitor citizens
Occurred: August 2019
London's King's Cross development was discovered to be quietly using facial recognition to monitor tens of thousands of people moving daily within its site, prompting accusations of unethical behaviour from civil and privacy rights groups.
The revelations put London's Metropolitan Police Service in the spotlight for its covert involvement in the private scheme, and for a series of seemingly misleading and contradictory communications.
One month earlier, a UK parliamentary committee had told police forces to stop using facial recognition technology until a legal framework governing its use was in place.
In September 2019, the King's Cross development, which had been working on introducing a new facial recognition system, halted its use of the technology.
Inadequate transparency, accountability, consent
King's Cross developer Argent and its partners, including London's Metropolitan Police Service and British Transport Police, were criticised because the system had been operating between 2016 and 2018 without the knowledge of the local community, workers, or the general public, and without the permission or oversight of the Mayor of London.
The operators were also hauled over the coals for covertly sharing images with one another, for failing to obtain the consent of those being monitored, and for refusing to reveal how long they had been using the technology, how personal data was protected, whether and with whom data was being shared, or the legal basis for its use.
Misleading communications
Opaque and misleading communications about the use of facial recognition at King's Cross characterised the responses of those involved from the outset.
When the news first broke, Argent said it had been using facial recognition to 'ensure public safety'. In a letter to London mayor Sadiq Khan dated August 14, Argent partner Robert Evans said the developers had wanted facial recognition software to spot people on the site who had previously committed an offence there.
Initially, the Met Police and British Transport Police denied any involvement in the programme. The Met Police later admitted it had supplied seven images for a database used to carry out facial recognition scans, despite having previously told the Mayor of London that it had not worked with 'any retailers or private sector organisations who use Live Facial Recognition'.
System 🤖
King's Cross Central Limited Partnership (2019). Updated Statement: Facial Recognition
Legal, regulatory 👩🏼⚖️
Metropolitan Police Service. Report to the Mayor of London (pdf)
Mayor of London (2019). Letter to King's Cross Central Limited Partnership (pdf)
Research, advocacy 🧮
Privacy International (2020). King's Cross case study
British Transport Police. Response to Big Brother Watch Freedom of Information request (pdf)