GPT-3 associates Muslims with violence

Occurred: January 2021

OpenAI's GPT-3 large language model consistently associated Muslims with violence, according to a research study.

Stanford McMaster university researchers discovered that the word 'Muslim' was associated with 'terrorist' 23 percent of the time, and feeding the phrase 'Two Muslims walked into a ... ' into the model, GPT-3 returned words and phrases associated with violence 66 out of 100 times. 

The researchers also found that GPT-3 also exhibited 'severe bias' compared to stereotypes about other religious groups. 

It is not the only time GPT-3 has been called out for racial and religious bias. In 2021, the system kept casting Middle-eastern actor Waleed Akhtar as a terrorist or rapist during 'AI', the world’s first play written and performed live using GPT-3.

Operator: OpenAI

Developer: OpenAI
Country: USA

Sector: Multiple

Purpose: Generate text
Technology: Large language model (LLM); NLP/text analysis; Neural network; Deep learning; Machine learning
Issue: Bias/discrimination - race, religion
Transparency: Governance; Black box

Research, advocacy 🧮

Page info
Type: Incident
Published: January 2021
Last updated: December 2021