SimCLR, iGPT racial bias, stereotyping
Occurred: January 2021
MIT Technology Review reports that a new research study (pdf) finds that prominent unsupervised image models, including Google's SimCLR and OpenAI's iGPT, are biased and prone to negative stereotyping.
To highlight the issue, researchers Ryan Steed and Aylin Caliskan showed iGPT a headshot of prominent US politician Alexandria Ocasio-Cortez ('AOC') wearing business attire, only for the system to repeatedly recreate her in a bikini or low-cut top.
Meanwhile, the researchers were criticised for including AI-generated images of AOC in the pre-print paper, thereby exposing her to further potential sexualisation and other possible abuses.
Operator: Alphabet/Google; OpenAI
Developer: Alphabet/Google; OpenAI
Country: USA
Sector: Multiple; Research/academia
Purpose: Generate images
Technology: Image generation; Neural network; Deep learning; Machine learning
Issue: Accuracy/reliability; Bias/discrimination - gender, race
Transparency: Black box
System
Google SimCLR
OpenAI Image GPT ('iGPT')
Research, advocacy
Steed R., Caliskan A. (2021). Image Representations Learned With Unsupervised Pre-Training Contain Human-like Biases (pdf)
News, commentary, analysis
https://www.technologyreview.com/2021/01/29/1017065/ai-image-generation-is-racist-sexist/
https://www.hitc.com/en-gb/2021/02/04/alexandria-ocasio-cortez-in-a-bikini/
https://towardsdatascience.com/algorithms-are-not-sexist-we-are-795525769e8e
https://mixed.de/ki-vorurteil-alexandria-ocasio-cortez-traegt-meistens-bikini/
Page info
Type: Incident
Published: February 2021