MEP files lawsuit to release iBorderCtrl lie detection system documents

Member of the European Parliament Patrick Breyer took the European Commission to court to find out how an AI-powered, EU-funded lie detection system was developed and tested.

Breyer had initially requested ethical assessments and other information on the lie detection module of iBorderCtrl from the EU's Research Executive Agency (REA), following a 2019 investigation by The Intercept which found the technology to be unreliable: it had incorrectly identified four out of sixteen honest answers put to it as false.

Breyer also pressed the Commission on whether the technology discriminates against certain groups of people, including people of colour, women, the elderly, children, and people with disabilities.

The MEP was refused access to the requested information on the grounds that releasing it could undermine public security and jeopardise 'commercial interests'. Breyer sued the European Commission to gain access to the documents, and to information on the controversial trials conducted for the technology.

In December 2021, the EU's General Court ruled that the REA could not keep these documents completely secret, and that ethical and legal evaluations of technologies for 'automated deception detection' or automated 'risk assessment' must be published, as long as they did not relate specifically to the iBorderCtrl project.

However, it also ruled that the assessment of the ethical risks of the specific iBorderCtrl technology, such as the risks of stigmatisation and false positives, its legal admissibility, and the reports on the results of the pilot project could be kept secret in order to protect commercial interests.

In September 2023, the CJEU ruled on a 2022 appeal lodged by Breyer, who had argued that 'the public interest in disclosure outweighs private commercial interests' and that there should be transparency from the beginning of the research phase. The court dismissed the appeal, finding that the commercial interests invoked by the REA outweighed the public interest in disclosure.

The ruling prompted concerns about the extent to which commercial interests can be invoked to restrict access to information about surveillance technologies funded by taxpayers.

Incident databank

Operator: 
Developer: European Dynamics; Manchester Metropolitan University
Country: European Union
Sector: Govt - immigration
Purpose: Detect traveller lies
Technology: Behavioural analysis; Facial recognition; Emotion recognition
Issue: Accuracy/reliability; Bias/discrimination - race, ethnicity, gender, age, disability; Human/civil rights
Transparency: Governance; Legal
