Meta AI image generator struggles to produce interracial couples

Occurred: March 2024

Meta's image generator failed to generate images of mixed-race couples, leading to accusations of racial bias and stereotyping.

Users complained that Imagine with Meta AI repeatedly failed to create pictures of Asian men with white women, and vice versa. Similarly, when asked for an image of a Black man with a white wife, it produced images of a Black couple instead.

A prompt for “an interracial couple” resulted in the response: “This image can’t be generated. Please try something else.”

The tool's limitations were seen to perpetuate racial and ethnic stereotypes and reinforce existing societal biases. Furthermore, by failing to represent interracial couples, the tool was accused of inadvertently contributing to a lack of diversity and inclusivity in visual content.

Incident databank

Operator: Meta
Developer: Meta
Country: USA
Sector: Multiple
Purpose: Generate images
Technology: Text-to-image
Issue: Bias/discrimination; Stereotyping
Transparency: Governance