SimCLR, iGPT image generation systems found to contain racial bias
Occurred: January 2021
Page published: February 2021
Prominent image generation algorithms, including Google’s SimCLR and OpenAI’s iGPT, are biased and prone to negative stereotyping, according to a research study.
Researchers Ryan Steed and Aylin Caliskan showed iGPT a head shot of prominent US politician Alexandria Ocasio-Cortez ('AOC') wearing business attire, only for the software to recreate her multiple times in a bikini or low-cut top.
Meanwhile, the researchers were criticised for including the AI-generated images of AOC in their pre-print paper, thereby exposing her to further potential sexualisation and abuse.
Steed, R. and Caliskan, A. (2021). Image Representations Learned With Unsupervised Pre-Training Contain Human-like Biases (pdf)
https://www.technologyreview.com/2021/01/29/1017065/ai-image-generation-is-racist-sexist/
https://www.hitc.com/en-gb/2021/02/04/alexandria-ocasio-cortez-in-a-bikini/
https://towardsdatascience.com/algorithms-are-not-sexist-we-are-795525769e8e
https://mixed.de/ki-vorurteil-alexandria-ocasio-cortez-traegt-meistens-bikini/
AIAAIC Repository ID: AIAAIC0511