Trevis Williams wrongfully arrested due to NYPD facial recognition error
Occurred: April 2025
Page published: September 2025
A New York City man was wrongfully arrested and jailed due to a flawed police facial recognition match that mistakenly identified him as a suspect in a Manhattan indecent exposure case, despite physical mismatches and alibi evidence.
Trevis Williams was stopped and arrested by the New York Police Department (NYPD) after facial recognition technology flagged him as a match to a man accused of exposing himself in a Manhattan building, although Williams was actually 19 kilometers away in Brooklyn at the time.
The victim identified Williams from a photo lineup that included his image, despite significant discrepancies in height and weight between Williams (6'2", 230 lbs) and the suspect (about 5'6", 160 lbs).
Williams was jailed for over two days. Charges were dropped several months later.
The wrongful arrest caused serious direct harm to Williams - including loss of liberty, stress, fear of wrongful sex offender registration, and damage to reputation - and raised broader concerns about racial bias, reliability, and accountability in police use of facial recognition technology, especially for people of colour.
Williams' arrest stemmed from the NYPD's use of facial recognition technology on low-quality, grainy CCTV images, which the system matched to a mugshot of Williams from a prior, unrelated arrest.
Despite warnings in the facial recognition reports that a match does not constitute probable cause for arrest, the NYPD relied heavily on the technology and the victim's misidentification, and failed to verify Williams' alibi through phone location data or by contacting his employer.
This failure reflects insufficient oversight of surveillance tools, over-reliance on imperfect AI matches, and a lack of adequate human verification before arrest.
The technology has high error rates on uncontrolled, low-quality images and exhibits racial bias, in part because many systems are trained predominantly on white faces, compounding the risks for Black individuals such as Williams.
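As a rough, hypothetical illustration of why a facial recognition match is treated as an investigative lead rather than evidence, the Python sketch below ranks mugshot embeddings against a probe image by cosine similarity and applies a confidence threshold. It is a generic example, not the NYPD's actual system; the embeddings, threshold value and function names are invented for illustration. A degraded, grainy probe pushes similarity scores into a range where false positives are common, which is why such reports carry "not probable cause" disclaimers.

```python
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def rank_candidates(probe, gallery, threshold=0.6):
    """Rank gallery identities by similarity to the probe embedding.

    Returns (identity, score, above_threshold) tuples, best first.
    Even the top-ranked candidate is only an investigative lead:
    crossing the threshold is not proof of identity, and grainy
    CCTV probes tend to produce unreliable scores.
    """
    scored = [(name, cosine_similarity(probe, emb)) for name, emb in gallery.items()]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return [(name, score, score >= threshold) for name, score in scored]


# Synthetic 128-dimensional embeddings stand in for a real face encoder's output.
rng = np.random.default_rng(0)
gallery = {f"mugshot_{i}": rng.normal(size=128) for i in range(5)}

# A heavily degraded probe: the true identity's embedding plus strong noise,
# mimicking the effect of a low-quality CCTV still.
probe = gallery["mugshot_3"] + rng.normal(scale=1.5, size=128)

for name, score, is_lead in rank_candidates(probe, gallery):
    print(f"{name}: similarity={score:.2f} investigative_lead={is_lead}")
```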
The wrongful arrest of Trevis Williams exemplifies the risks and societal implications of unchecked facial recognition technology in policing and criminal justice.
It also raises questions about the accuracy and reliability of the NYPD's facial recognition tool and the transparency of its use.
For society, the incident highlights urgent calls to reassess the deployment of facial recognition technology in policing to prevent racial injustice, protect civil liberties, and ensure that AI tools do not replace critical human judgement.
Advocates urge bans or stricter regulations on such surveillance technologies until accountability, transparency and accuracy improve, to safeguard citizens’ rights and prevent further harms caused by wrongful arrests driven by flawed AI systems.
Facial recognition system
A facial recognition system is a technology potentially capable of matching a human face from a digital image or a video frame against a database of faces.
Source: Wikipedia 🔗
Developer:
Country: USA
Sector: Govt - police
Purpose: Identify criminal suspect
Technology: Facial recognition
Issue: Accountability; Accuracy/reliability; Human/civil rights; Transparency
https://www.nytimes.com/2025/08/26/nyregion/nypd-facial-recognition-dismissed-case.html
https://www.cbsnews.com/newyork/news/nyc-facial-recognition-technology-wrongful-arrest-indecent-exposure/
https://petapixel.com/2025/08/28/new-york-police-nypd-facial-recognition-tech-blunder-leads-to-62-man-jailed-for-crime-by-56-suspect/
https://abc7ny.com/post/man-falsely-jailed-nypds-facial-recognition-surveillance-tech-failed/17664671/
https://www.zmescience.com/science/news-science/nyc-man-was-jailed-for-days-because-of-a-blurry-cctv-image-and-a-faulty-ai-match/
Incident no: AIAAIC2026