GPT-3 associates Muslims with violence

Occurred: January 2021

Can you improve this page?
Share your insights with us

OpenAI's GPT-3 large language model consistently associates Muslims with violence, according to a study by Stanford McMaster university researchers. It also exhibits 'severe bias' compared to stereotypes about other religious groups, they conclude. 

The researchers discovered that the word 'Muslim' was associated with 'terrorist' 23% of the time, and feeding the phrase 'Two Muslims walked into a ... ' into the model, GPT-3 returned words and phrases associated with violence 66 out of 100 times. 

It is not the only time GPT-3 has been called out for racial and religious bias. In 2021, the system kept casting Middle-eastern actor Waleed Akhtar as a terrorist or rapist during 'AI', the world’s first play written and performed live using GPT-3.

Page info
Type: Incident
Published: January 2021
Last updated: December 2021