Innocent customer thrown out of supermarket after facial recognition alert
Occurred: January 2026
Page published: February 2026
A 42-year-old data professional was wrongfully ejected from a Sainsbury’s supermarket in London, UK, after staff misidentified him following a facial recognition alert, highlighting the severe personal and social risks of automated surveillance in public spaces.
Warren Rajah was approached by Sainsbury’s staff and security, told he had been flagged by a facial recognition system, and ordered to leave immediately, forcing him to abandon his shopping and exit the store in front of other customers without a clear explanation.
After contacting the facial recognition provider, Facewatch, Rajah was told he was not on their database and had not triggered any alert. Rather, the error arose from staff mistaking him for another person present in the store at the same time.
Sainsbury’s later apologised and offered him a small shopping voucher. Both companies described the issue as human, not technological, error.
Sainsbury’s used the Facewatch facial recognition system to alert staff when someone on an offenders’ database entered the store; the system reportedly flagged the correct person, but staff then approached the wrong customer on the shop floor.
Both Sainsbury’s and Facewatch have described the incident as “human error” rather than a failure of the underlying software, a framing that underlines how heavily such systems still depend on hurried, fallible judgments by staff under pressure.
Rajah’s account suggests poor training and unclear procedures: he was given no meaningful explanation and no immediate way to challenge the decision, and was instead directed to a QR code and an in-store sign, which effectively shifted the burden onto him to prove his innocence.
For Warren Rajah: The experience was distressing and humiliating, leaving him feeling criminalised despite being innocent.
For the general public: The case raises broader concerns about biometric surveillance in public spaces, including the potential for misidentification, inadequate safeguards, and the emotional and reputational harm that can follow. Civil liberties advocates say such incidents can create a “chilling effect,” where ordinary customers feel they might be wrongly targeted by automated systems.
For policymakers: The incident underscores that a high claimed “algorithmic accuracy” (99.98%, according to the vendor) counts for little if the operational procedures around it are flawed; the illustrative sketch below shows why. It points to a need for stricter regulation of how private companies manage “watchlists” and for a legally mandated, transparent process for challenging automated or semi-automated decisions.
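To see why a headline accuracy figure offers limited reassurance, the minimal sketch below applies the vendor’s claimed 99.98% accuracy to a hypothetical deployment. The daily scan volume and store count are illustrative assumptions, not reported figures.

```python
# Back-of-the-envelope illustration (hypothetical numbers, not actual
# Sainsbury's or Facewatch data): even a very accurate system generates
# routine false alerts at retail scale, and every alert is then filtered
# through the same fallible human step that failed in this incident.

claimed_accuracy = 0.9998            # vendor's claimed accuracy figure
error_rate = 1 - claimed_accuracy    # 0.02% of scans mishandled

faces_scanned_per_day = 5_000        # assumed footfall past the camera, per store
stores_in_deployment = 100           # assumed number of stores running the system

false_alerts_per_store_per_day = faces_scanned_per_day * error_rate
false_alerts_per_week = false_alerts_per_store_per_day * stores_in_deployment * 7

print(f"Expected false alerts per store per day: {false_alerts_per_store_per_day:.1f}")
print(f"Expected false alerts per week across the deployment: {false_alerts_per_week:.0f}")
# With these assumptions: roughly 1 false alert per store per day,
# and around 700 per week across 100 stores.
```

The precise numbers matter less than the structure: per-scan error rates compound across footfall, so it is the procedural safeguards around each alert, not the accuracy claim, that determine how often innocent customers are confronted.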
Facewatch FR
Developer: Facewatch
Country: UK
Sector: Retail
Purpose: Identify criminal suspects
Technology: Facial recognition
Issue: Accountability; Autonomy; Transparency
September 2025. Sainsbury’s begins trialling Facewatch facial recognition in select UK stores.
January 27, 2026. Warren Rajah is confronted by three staff members and ejected from the Elephant and Castle store.
Late January to early February 2026. Rajah contacts Facewatch; to prove his innocence, he is required to submit his passport and a photograph.
February 5, 2026. The incident gains national media attention.
February 6, 2026. Sainsbury’s issues a public apology, offers a £75 voucher, and announces “additional training” for store management.
February 2026. Civil liberties groups such as Big Brother Watch cite the case in calls for a total ban on live facial recognition in retail.
AIAAIC Repository ID: AIAAIC2209