Amazon Rekognition falsely links athletes to mugshots

Occurred: October 2019

Amazon’s Rekognition system falsely linked the faces of 27 professional athletes in New England, USA, to mugshots in a criminal database. 

The Massachusetts ACLU used Amazon's Rekognition facial recognition system to run photographs of 188 well-known local athletes against a database of 20,000 mugshots, and found that the product misidentified 27 of them. The result was verified by an independent industry expert.

Amazon responded that the ACLU had been 'knowingly misusing and misrepresenting Amazon Rekognition to make headlines', and argued that Rekognition could help identify criminals and missing children when used with its recommended 99 percent confidence threshold.
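The threshold dispute is central here: a face-search API returns candidate matches ranked by a similarity score, and the caller decides which scores count as a "match". The sketch below is a minimal, hypothetical illustration of that filtering step; the record names and scores are invented, and only loosely mirror the `Similarity` values Rekognition's CompareFaces API reports.

```python
# Hypothetical sketch of threshold filtering in a face search.
# The 'Face' labels and 'Similarity' scores below are invented examples,
# not real Rekognition output.

def filter_matches(matches, threshold):
    """Keep only candidate matches at or above the given similarity score."""
    return [m for m in matches if m["Similarity"] >= threshold]

candidates = [
    {"Face": "mugshot_0412", "Similarity": 84.2},
    {"Face": "mugshot_1187", "Similarity": 91.6},
    {"Face": "mugshot_0033", "Similarity": 99.3},
]

# At Rekognition's default 80 percent threshold, all three mugshots count
# as matches; at the 99 percent threshold Amazon recommends for law
# enforcement use, only one remains.
print(len(filter_matches(candidates, 80)))  # 3
print(len(filter_matches(candidates, 99)))  # 1
```

The point of contention was that the ACLU reportedly ran its test at the default threshold, while Amazon maintained that law enforcement deployments should use the stricter 99 percent setting.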

The incident was seen to underscore issues with the accuracy and reliability of Rekognition, and to highlight its implications for civil rights and liberties, including racial bias and discrimination.

It also prompted rights activists and others to call for a moratorium on government use of facial recognition technology, and for dedicated federal and local legislation.

Databank

Operator: American Civil Liberties Union (ACLU)  
Developer: Amazon/AWS
Country: USA
Sector: Media/entertainment/sports/arts
Purpose: Strengthen law enforcement
Technology: Facial recognition
Issue: Accuracy/reliability; Bias/discrimination - race; Human/civil rights; Transparency
