SimCLR, iGPT image generation systems found to contain racial bias
Occurred: January 2021
Page published: February 2021
Prominent computer vision models, including Google’s SimCLR and OpenAI’s iGPT, are biased and prone to negative stereotyping, according to a research study.
Using a new diagnostic tool, the Image Embedding Association Test (iEAT), researchers Ryan Steed and Aylin Caliskan audited two state-of-the-art computer vision models: SimCLR (developed by Google) and iGPT (developed by OpenAI). They found that both models, which learn by scanning millions of unlabeled images from the internet, did not just learn to recognise objects; they also absorbed social prejudices.
Specifically, the models exhibited:
Racial bias: Stronger associations between "White" faces and "pleasant" attributes compared to "Black" faces.
Gender bias: Stereotypical associations of "women" with "home" and "arts," while "men" were more strongly linked to "science" and "business." The researchers showed iGPT a headshot of prominent US politician Alexandria Ocasio-Cortez ('AOC') wearing business attire, only for the software to complete the image multiple times with her in a bikini or low-cut top.
Intersectional harms: The biases were compounded for individuals at the intersection of marginalized identities, such as Black women.
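The iEAT mentioned above adapts the logic of the Word Embedding Association Test (WEAT) to image embeddings, comparing how closely a model's representations of two target groups sit to two sets of attribute images. The sketch below is a minimal, illustrative version of that general effect-size calculation; the function names and the random stand-in embeddings are assumptions for demonstration, not the authors' code.

```python
# Minimal sketch of a WEAT-style embedding association test, the general
# methodology that iEAT applies to image embeddings. Names and toy data
# here are illustrative only.
import numpy as np

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

def association(w, A, B):
    # Differential association of one embedding w with attribute sets A and B.
    return np.mean([cosine(w, a) for a in A]) - np.mean([cosine(w, b) for b in B])

def effect_size(X, Y, A, B):
    # Cohen's-d-style effect size: positive values mean target set X is more
    # strongly associated with attribute set A (and Y with B) than vice versa.
    assoc_X = [association(x, A, B) for x in X]
    assoc_Y = [association(y, A, B) for y in Y]
    pooled_std = np.std(assoc_X + assoc_Y, ddof=1)
    return (np.mean(assoc_X) - np.mean(assoc_Y)) / pooled_std

# Usage with random stand-in vectors; in iEAT these would be embeddings of
# target and attribute images extracted from a pre-trained model such as
# SimCLR or iGPT.
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 128))   # target images, group 1
Y = rng.normal(size=(8, 128))   # target images, group 2
A = rng.normal(size=(8, 128))   # attribute images, e.g. "pleasant"
B = rng.normal(size=(8, 128))   # attribute images, e.g. "unpleasant"
print(effect_size(X, Y, A, B))
```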
These findings were significant because unsupervised (or self-supervised) models were previously thought to be "cleaner" than supervised models since they don't rely on human-labeled data - which is a known source of bias.
Meanwhile, the researchers were criticised for including AI-generated images of AOC in the pre-print paper, thereby exposing her to additional potential sexualisation and other possible abuse.
Steed R., Caliskan A. (2021). Image Representations Learned With Unsupervised Pre-Training Contain Human-like Biases
https://www.technologyreview.com/2021/01/29/1017065/ai-image-generation-is-racist-sexist/
https://www.hitc.com/en-gb/2021/02/04/alexandria-ocasio-cortez-in-a-bikini/
https://towardsdatascience.com/algorithms-are-not-sexist-we-are-795525769e8e
https://mixed.de/ki-vorurteil-alexandria-ocasio-cortez-traegt-meistens-bikini/
AIAAIC Repository ID: AIAAIC0511