SimCLR, iGPT racial bias, stereotyping

Occurred: January 2021

MIT Technology Review reports that a new research study (pdf) finds that prominent image models, including Google's SimCLR and OpenAI's iGPT, are biased and prone to negative stereotyping.

To highlight the issue, researchers Ryan Steed and Aylin Caliskan showed iGPT a headshot of prominent US politician Alexandria Ocasio-Cortez ('AOC') wearing business attire, only for the software to recreate her multiple times in a bikini or low-cut top.

Meanwhile, the researchers were criticised for including AI-generated images of AOC in the pre-print paper, thereby exposing her to additional potential sexualisation and other possible abuses.

Page info
Type: Incident
Published: February 2021