Study finds Amazon Rekognition suffers from racial and gender bias

Occurred: January 2019


An MIT Media Lab study concluded that Amazon's Rekognition facial recognition system performed worse at identifying an individual's gender when the person was female or darker-skinned.


The MIT researchers compared gender classification tools from five companies, including Microsoft and IBM, and found that Rekognition performed worst at classifying darker-skinned women, with an error rate of 31.37 percent. It also misclassified women as men 19 percent of the time.

Amazon claimed the research was misleading, arguing that the researchers had not tested the most recent version of Rekognition and that the test involved facial analysis (which detects expressions and characteristics such as facial hair) rather than facial identification (which matches scanned faces to images such as mugshots).

In their paper, the researchers also argued that concerns beyond accuracy should be considered. 'The potential for weaponization and abuse of facial analysis technologies cannot be ignored nor the threats to privacy or breaches of civil liberties diminished even as accuracy disparities decrease,' they wrote.

Databank

Operator: MIT Media Lab, Joy Buolamwini, Deborah Raji
Developer: Amazon/AWS
Country: USA
Sector: Education
Purpose: Identify individuals
Technology: Facial recognition
Issue: Bias/discrimination - racial, gender; Dual/multi-use