Study: BDD100K dataset is worse at spotting people with darker skin
Occurred: March 2019
Object detection models trained on the BDD100K self-driving video dataset are less effective at detecting individuals with darker skin tones, according to researchers.
Researchers at the Georgia Institute of Technology found that BDD100K contains a far lower proportion of "dark skin" labels than "light skin" labels, an imbalance that reduces the accuracy and reliability of autonomous driving systems in scenarios involving pedestrians with darker skin tones.
By hiring people to manually label pedestrians by skin colour according to the Fitzpatrick scale, a scale commonly used to classify human skin tone, the researchers found that models trained on BDD100K are, on average, 4.8 percent more accurate at correctly spotting light-skinned pedestrians, and up to 12 percent worse at spotting people with darker skin.
The finding highlighted the need for more diverse and inclusive datasets.
Llorca D.F. et al. (2023). Attribute Annotation and Bias Evaluation in Visual Datasets for Autonomous Driving
Wilson B., Hoffman J., Morgenstern J. (2019). Predictive Inequity in Object Detection
https://mashable.com/article/self-driving-cars-trouble-detecting-darker-skin
https://www.theguardian.com/technology/shortcuts/2019/mar/13/driverless-cars-racist
https://www.mhlnews.com/technology-automation/article/22055534/researchers-say-autonomous-vehicles-could-be-racist
Page info
Type: Issue
Published: June 2024