Study: Facial recognition software misidentifies dark-skinned women 35 percent of the time 

Occurred: February 2018

Facial recognition technology exhibits significant gender and racial bias, with gender-classification error rates as high as 35 percent for darker-skinned women compared with less than 1 percent for lighter-skinned men, according to researchers.

What happened

MIT and Stanford University researchers Joy Buolamwini and Timnit Gebru conducted a study on commercial facial-analysis programmes, revealing large disparities in accuracy across different demographic groups.

They found that error rates for gender classification were consistently higher for females than males, and for darker-skinned subjects than lighter-skinned ones. 

For darker-skinned women, error rates reached as high as 34.7 percent, while for lighter-skinned men they never exceeded 0.8 percent.
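The study's core method was disaggregated evaluation: scoring the same classifier separately for each intersectional group, rather than reporting a single aggregate accuracy that can mask large gaps. Below is a minimal sketch of that kind of audit in Python; the column names and data are hypothetical, not taken from the study's benchmark.

```python
import pandas as pd

# Hypothetical audit records: one row per test image, with annotated
# skin type and gender plus the classifier's predicted gender.
results = pd.DataFrame({
    "skin_type":  ["darker", "darker", "lighter", "lighter", "darker", "lighter"],
    "gender":     ["female", "female", "male",    "male",    "male",   "female"],
    "true_label": ["female", "female", "male",    "male",    "male",   "female"],
    "predicted":  ["male",   "female", "male",    "male",    "male",   "female"],
})

results["error"] = results["true_label"] != results["predicted"]

# Error rate per intersectional group; a single overall accuracy
# figure would hide the disparity between these subgroups.
per_group = results.groupby(["skin_type", "gender"])["error"].mean()
print(per_group)
```

On this toy data, overall accuracy looks strong while the darker-skinned female subgroup fails half the time, which is precisely the kind of gap an aggregate score conceals.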

Why it happened

Bias in facial recognition algorithms stems largely from inadequate diversity in training datasets: many contain too few examples of non-white faces for the algorithms to learn to identify them correctly.

The problem is rooted in the historical underrepresentation of people of colour and women in the large-scale datasets used to train and benchmark these systems.

Additionally, model designers have largely neglected to measure performance disparities for historically marginalised groups.
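A basic safeguard that follows from this is a composition audit of the training data itself, before any model is trained. The sketch below is illustrative only; the annotations and group labels are hypothetical.

```python
from collections import Counter

# Hypothetical per-image demographic annotations for a training set.
annotations = [
    ("lighter", "male"), ("lighter", "male"), ("lighter", "female"),
    ("lighter", "male"), ("darker", "male"), ("darker", "female"),
]

counts = Counter(annotations)
total = len(annotations)

# Report each group's share of the data. Heavily skewed shares are an
# early warning that the model will see too few examples of some groups.
for group, n in counts.most_common():
    print(f"{group}: {n} images ({n / total:.0%})")
```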

What it means

Bias in facial recognition technology is seen to perpetuate and amplify existing racial and gender inequalities, with real potential impacts in law enforcement, immigration and other sectors, where misidentification can lead to wrongful arrests, lengthy detention and even deadly police violence.

System 🤖

Operator: 
Developer: IBM; Megvii; Microsoft
Country: USA
Sector: Govt - police; Govt - immigration
Purpose: Identify individuals
Technology: Facial recognition
Issue: Accuracy/reliability; Bias/discrimination