SimCLR, iGPT image models found to exhibit race and gender bias
Occurred: January 2021
Prominent unsupervised computer vision models, including Google's SimCLR and OpenAI's iGPT, learn human-like biases and are prone to negative stereotyping, according to a research study.
Researchers Ryan Steed and Aylin Caliskan fed iGPT a cropped headshot of prominent US politician Alexandria Ocasio-Cortez ('AOC'), taken from a photograph of her in business attire, only for the software to autocomplete the image multiple times with her in a bikini or low-cut top.
Meanwhile, the researchers were criticised for including the AI-generated images of AOC in their pre-print paper, thereby exposing her to further potential sexualisation and abuse.
System 🤖
Google SimCLR
OpenAI Image GPT ('iGPT')
Operator: Alphabet/Google; OpenAI
Developer: Alphabet/Google; OpenAI
Country: USA
Sector: Multiple; Research/academia
Purpose: Generate images
Technology: Image generation; Neural network; Deep learning; Machine learning
Issue: Accuracy/reliability; Bias/discrimination - gender, race
Transparency: Black box
Research, advocacy 🧮
Steed, R. and Caliskan, A. (2021). Image Representations Learned With Unsupervised Pre-Training Contain Human-like Biases (pdf)
News, commentary, analysis 🗞️
https://www.technologyreview.com/2021/01/29/1017065/ai-image-generation-is-racist-sexist/
https://www.hitc.com/en-gb/2021/02/04/alexandria-ocasio-cortez-in-a-bikini/
https://towardsdatascience.com/algorithms-are-not-sexist-we-are-795525769e8e
https://mixed.de/ki-vorurteil-alexandria-ocasio-cortez-traegt-meistens-bikini/
Page info
Type: Incident
Published: February 2021