Engineer warns Microsoft Copilot Designer creates violent, sexual images

Occurred: December 2023-March 2024


An AI engineer involved in testing Microsoft's Copilot Designer image generator tool found that it produced violent and sexist images, and appeared to violate copyright law. 

Shane Jones observed Copilot Designer 'easily' generate sexualised images of women, underage drinking and drug use, teenagers with assault rifles, religious stereotyping, political bias, and other inappropriate content. He also saw the tool produce images of Disney characters potentially violating copyright laws and Microsoft’s policies.  

However, Jones' attempts to persuade Microsoft to take down the tool, or to fix OpenAI's DALL-E model that powered it, were unsuccessful, prompting him to post open letters to Microsoft's Board of Directors and US Federal Trade Commission chair Lina Khan.

The episode was seen to illustrate poor governance of Copilot Designer, and a lack of transparency concerning the tool's risks and harms. Jones told CNBC he had been told in meetings that Microsoft was only tackling the most serious issues, and that it lacked the resources to investigate most risks and problematic outputs.

Databank

Operator: Microsoft
Developer: Microsoft
Country: USA
Sector: Technology
Purpose: Generate images
Technology: Text-to-image; Diffusion model; Neural network; Deep learning; Machine learning
Issue: Bias/discrimination - political; Copyright; Safety
Transparency: Governance
