British Bangladeshi man wrongfully arrested for theft after facial recognition error
Occurred: January 2026
Page published: February 2026
A 26-year-old Asian man in the UK was wrongfully arrested and detained for nearly ten hours due to a facial recognition algorithm that misidentified him as a burglary suspect, highlighting the real-world harms of unregulated biometric technologies and the urgent need for stronger oversight.
Software engineer Alvi Choudhury was arrested at his home in Southampton, UK, in front of his neighbours and parents, and held in custody for nearly 10 hours for a GBP 3,000 residential burglary in Milton Keynes, 115 miles away.
However, Choudhury had never visited Milton Keynes, had a sound alibi, and clearly differed from the real suspect, who appeared roughly ten years younger and had different facial features, including no facial hair.
The incident caused real emotional distress to Choudhury and his family, damaged his reputation in his neighbourhood, and potentially jeopardised his career.
Choudhury is suing the police for wrongful arrest.
CCTV footage of the theft, run through retrospective facial recognition software used by Thames Valley Police, suggested Choudhury as a possible match.
However, the software, supplied under a Home Office procurement from German company Cognitec, wrongly flagged Choudhury, an image of whom was already on a national police database due to a prior mistaken arrest in 2021.
The police claimed a "human visual assessment" confirmed the match. But Choudhury noted that officers later admitted he looked nothing like the CCTV footage, suggesting that "automation bias" led officers to trust the software over their own eyes.
Questions have also been raised about the accuracy of the system, especially for Asian faces. The police reportedly admitted that the arrest "may have been the result of bias within facial recognition technology."
A UK government assessment of Cognitec's software revealed that the police have been using outdated algorithms from 2020, and that at the specific settings used by the police, the system is 100 times more likely to misidentify Asian subjects (a 4.0% error rate) than white subjects (0.04%).
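The "100 times" figure follows directly from the two reported error rates. A minimal sketch of the arithmetic (the rates are from the government assessment; the function name is illustrative, not from any cited source):

```python
def disparity_ratio(error_rate_a: float, error_rate_b: float) -> float:
    """Return how many times more likely group A is to be misidentified
    than group B, given each group's false-match rate."""
    return error_rate_a / error_rate_b

# Reported false-match rates at the settings used by the police:
asian_fmr = 0.040   # 4.0% for Asian subjects
white_fmr = 0.0004  # 0.04% for white subjects

print(disparity_ratio(asian_fmr, white_fmr))  # roughly 100
```

At a 4.0% false-match rate, around 1 in every 25 Asian faces searched against the database would be wrongly flagged, versus roughly 1 in 2,500 white faces.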
For Choudhury, the incident is not only about a single traumatic arrest but about being placed in a cycle of suspicion: he fears that because his mugshot remains on police systems, any future crime committed by "a brown person" could again place him under suspicion. The public arrest outside his home, the time in custody, and the disruption to his work may also damage his reputation and, he worries, his chances of obtaining government security clearances for his career.
For the general public, the case undercuts trust in both facial recognition tools and the police forces deploying them, especially among ethnic minority communities who already experience disproportionate policing. It illustrates how AI-powered surveillance technologies can cause real harm without robust safeguards, error handling, or oversight.
For policymakers, the case intensifies pressure to regulate or restrict facial recognition in policing and retail, including stronger accuracy and bias testing, clearer legal limits on its use as evidence, and mandatory reporting of errors and wrongful arrests. It also raises questions about data retention and whether people who have been cleared, or who were never charged, should have their images stored in massive mugshot databases used by these systems.
FaceVACS DBScan ID
Developer: Cognitec
Country: UK
Sector: Personal
Purpose: Identify criminal suspects
Technology: Facial recognition
Issue: Accountability; Accuracy/reliability; Automation bias; Autonomy; Fairness; Privacy; Transparency
Police and Criminal Evidence Act 1984
National Physical Laboratory. Facial Recognition Technologies: Accuracy and Equitability Evaluation of Cognitec FaceVACS-DBScan ID v5
AIAAIC Repository ID: AIAAIC2224