Study: Facial recognition software misidentifies dark-skinned women 35 percent of the time
Occurred: February 2018
Facial recognition technology exhibits significant gender and racial bias, with error rates up to 35 percent for dark-skinned women compared to less than 1 percent for light-skinned men, according to researchers.
MIT Media Lab researcher Joy Buolamwini and Microsoft Research scientist Timnit Gebru conducted a study of commercial facial-analysis programmes, revealing large disparities in accuracy across demographic groups.
They found that error rates for gender classification were consistently higher for females than males, and for darker-skinned subjects than lighter-skinned ones.
For darker-skinned women, error rates reached as high as 34.7 percent, while for lighter-skinned men they never exceeded 0.8 percent.
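To make the reported disparity concrete, the sketch below shows one way per-group error rates of this kind could be computed from labelled predictions. It is a minimal illustration, not the study's data or code; the records and field names are hypothetical placeholders.

```python
# Minimal sketch (not the study's code): auditing gender-classification
# error rates by intersectional subgroup, in the spirit of Gender Shades.
# Each record is hypothetical: (skin_type, gender, true_label, predicted_label).
from collections import defaultdict

records = [
    ("darker", "female", "F", "M"),
    ("darker", "female", "F", "F"),
    ("darker", "male", "M", "M"),
    ("lighter", "female", "F", "F"),
    ("lighter", "male", "M", "M"),
    ("lighter", "male", "M", "M"),
]

totals = defaultdict(int)
errors = defaultdict(int)
for skin, gender, truth, pred in records:
    group = (skin, gender)
    totals[group] += 1
    if pred != truth:
        errors[group] += 1

# Report the error rate for each intersectional subgroup.
for group in sorted(totals):
    rate = errors[group] / totals[group]
    print(f"{group[0]:>7} {group[1]:>6}: error rate {rate:.1%}")
```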
Bias in facial recognition algorithms stems largely from training datasets that lack sufficient diversity, leaving the algorithms poorly equipped to learn to correctly identify non-white faces.
The problem is rooted in the historical underrepresentation of people of colour and women in the large-scale datasets used to train and benchmark these systems.
Additionally, there has been a widespread lack of attention paid by model designers to measuring performance disparities for historically marginalised groups.
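As an illustration of the dataset-diversity point above, the following sketch flags under-represented intersectional subgroups in a training set before a model is fitted. It assumes per-image demographic annotations are available; the column names, counts, and the 15 percent floor are all hypothetical choices, not values from the study.

```python
# Minimal sketch, assuming per-image demographic annotations exist.
# Column names and the representation floor below are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "skin_type": ["lighter"] * 800 + ["darker"] * 200,
    "gender":    (["male"] * 500 + ["female"] * 300
                  + ["male"] * 120 + ["female"] * 80),
})

# Share of the training data held by each intersectional subgroup.
shares = (df.groupby(["skin_type", "gender"]).size() / len(df)).rename("share")
print(shares)

# Flag any subgroup falling below an arbitrary 15% representation floor.
floor = 0.15
for group, share in shares.items():
    if share < floor:
        print(f"Under-represented: {group} ({share:.0%} of training data)")
```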
Bias in facial recognition technology is seen to perpetuate and amplify existing racial and gender inequalities, with real potential impacts in law enforcement, immigration, and other sectors, where misidentification can lead to wrongful arrests, lengthy detentions, and even deadly police violence.
System: Cognitive Services Face API; Watson Visual Recognition API; Face++
Operator:
Developer: IBM; Megvii; Microsoft
Country: USA
Sector: Govt - police; Govt - immigration
Purpose: Verify identity
Technology: Facial recognition
Issue: Accuracy/reliability; Bias/discrimination
Buolamwini J., Gebru T. (2018). Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification. Proceedings of Machine Learning Research 81:77-91
https://www.nytimes.com/2018/02/09/technology/facial-recognition-race-artificial-intelligence.html
https://gizmodo.com/even-when-spotting-gender-current-face-recognition-tec-1822929750
https://fortune.com/2018/09/14/data-sheet-algorithmic-bias-buolamwini/
https://qz.com/1866848/why-ibm-abandoned-its-facial-recognition-program/
Page info
Type: Issue
Published: February 2025