Stable Diffusion generates job type gender, racial stereotypes

Occurred: June 2023

A Bloomberg test found that the Stable Diffusion image generator produces content riddled with gender and racial stereotypes when rendering people in 'high-paying' and 'low-paying' jobs.

Stable Diffusion was asked to generate 5,100 images from written prompts relating to job titles in 14 fields, as well as three categories relating to crime. Each subject's perceived skin tone was classified using the Fitzpatrick Scale, and perceived gender was also assessed. When the job-related images were categorised by gender, the tool had generated nearly three times as many images of men as of women.
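The counting step of an audit like this can be sketched in a few lines. The code below is a hypothetical illustration, not Bloomberg's actual methodology or data: it assumes each generated image has already been labelled with a perceived gender and a Fitzpatrick skin-tone type (I-VI), and simply tallies representation per job prompt.

```python
from collections import Counter

def tally(labels):
    """Tally perceived-gender and skin-tone counts per job prompt.

    labels: list of (job, gender, fitzpatrick_type) tuples, one per
    generated image. Labels are assumed to come from a prior
    annotation step (human or automated); none of this data is real.
    """
    by_job = {}
    for job, gender, tone in labels:
        stats = by_job.setdefault(job, {"gender": Counter(), "tone": Counter()})
        stats["gender"][gender] += 1
        stats["tone"][tone] += 1
    return by_job

# Illustrative sample only.
sample = [
    ("architect", "man", "II"),
    ("architect", "man", "I"),
    ("architect", "woman", "III"),
    ("janitor", "man", "V"),
]
result = tally(sample)
```

Aggregating counts this way per prompt is what allows skew (e.g. men outnumbering women three to one, or lighter skin tones dominating high-paying roles) to be measured rather than eyeballed.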

In addition, images generated for high-paying jobs such as architects, lawyers, and doctors were dominated by lighter skin tones, whereas low-paying jobs like janitors, dishwashers and social workers were dominated by darker skin tones. 

And the great majority of results for drug dealers and prison inmates were darker-skinned, whilst terrorists tended to be men with dark facial hair wearing head coverings.

The findings suggest the tool is regularly reinforcing and amplifying cultural stereotypes.

Operator: Stability AI; Canva; Deep Agency
Developer: Stability AI
Country: USA
Sector: Media/entertainment/sports/arts
Purpose: Generate images
Technology: Text-to-image; Diffusion model; Neural network; Deep learning; Machine learning
Issue: Bias/discrimination - race, ethnicity, gender
Transparency: Governance 


Page info
Type: Incident
Published: June 2023
Last updated: November 2023