SimCLR, iGPT image generation systems found to be racially biased

Occurred: January 2021

Prominent image generation models, including Google's SimCLR and OpenAI's iGPT, are biased and prone to negative stereotyping, according to a research study.

Researchers Ryan Steed and Aylin Caliskan showed iGPT a headshot of prominent US politician Alexandria Ocasio-Cortez ('AOC') wearing business attire, only for the software to recreate her multiple times in a bikini or low-cut top.

Meanwhile, the researchers were criticised for including AI-generated images of AOC in the pre-print paper, thereby potentially exposing her to further sexualisation and other abuses.

System 🤖

Operator: Alphabet/Google; OpenAI
Developer: Alphabet/Google; OpenAI
Country: USA
Sector: Multiple; Research/academia
Purpose: Generate images
Technology: Image generation; Neural network; Deep learning; Machine learning
Issue: Accuracy/reliability; Bias/discrimination - gender, race
Transparency: Black box

Research, advocacy 🧮

Page info
Type: Incident
Published: February 2021