GPT-3 anti-Muslim bias

January 2021
Updated: December 2021

OpenAI's GPT-3 large language model consistently associates Muslims with violence, according to a new study by Stanford McMaster university researchers. It also exhibits 'severe bias' compared to stereotypes about other religious groups, they conclude.

The researchers discovered that the word 'Muslim' was associated with 'terrorist' 23% of the time, and feeding the phrase 'Two Muslims walked into a ... ' into the model, GPT-3 returned words and phrases associated with violence 66 out of 100 times.

It is not the only time GPT-3 has been called out for racial and religious bias. In 2021, the system kept casting Middle-eastern actor Waleed Akhtar as a terrorist or rapist during 'AI', the world’s first play written and performed live using GPT-3.

  • Operator: OpenAI

  • Developer: OpenAI

  • Country: USA

  • Sector: Research/academia

  • Purpose: Generate natural language

  • Technology: NLP/text analysis

  • Issue: Bias/discrimination - race, religion

  • Transparency: Black box