Sora AI video generator accused of perpetuating sexist, racist, ableist biases
Occurred: March 2025
OpenAI's Sora video generator has been criticised for perpetuating sexist, racist, and ableist stereotypes, raising concerns about its development and its actual and potential impacts on society.
A WIRED investigation involving a series of prompts describing actions such as "a person walking" or job titles such as "a pilot" revealed that Sora frequently produces biased outputs, such as depicting CEOs and professors exclusively as men, flight attendants as women, and limited racial diversity in its representations.
It also stereotypes disabled individuals and ignores prompts for broader representation, amplifying harmful stereotypes and marginalising certain groups. For example, prompts for gay couples mostly returned conventionally attractive white men in their late 20s with the same hairstyles.
The biases stem from Sora's training data, which reflects the societal prejudices embedded in the datasets, and are seen to highlight OpenAI's apparent inability or unwillingness to align its products with mainstream human values and to make them "safe".
OpenAI has since acknowledged the issue and cited ongoing efforts to reduce bias through adjustments in training data and moderation techniques.
Women, people of colour, and individuals with disabilities are forced to suffer Sora's exclusionary narratives and stereotypes.
Less directly, society faces the risk of increasingly unrepresentative and discriminatory information from Sora and equivalent products.
The findings are seen to highlight the need for the ethical development of AI products and services.
Operator:
Developer: OpenAI
Country: USA
Sector: Multiple
Purpose: Generate video
Technology: Text-to-video; Machine learning
Issue: Bias/discrimination; Representation
Page info
Type: Issue
Published: April 2025