Leonardo AI generates non-consensual celebrity porn images

Occurred: March 2024


AI platform Leonardo has been accused of letting users easily create explicit, non-consensual images of celebrities.

Funded by Samsung, Leonardo is an image generation tool that gives users access to an array of user-generated Stable Diffusion text-to-image AI models, each trained to generate specific types of images, including reproducing the appearance of a specific individual.

According to 404 Media, whilst Leonardo's terms of service state that users cannot 'generate content that includes impersonations of any real person or falsely portrays an individual in a misleading or defamatory way', the platform's guardrails can easily be bypassed by slightly misspelling celebrity names and using sexually suggestive terms in image descriptions.

The ease with which users can instantly generate images on the site using its models highlights both the platform's versatility and how readily it can be misused.

Databank

Operator: Leonardo AI
Developer: Leonardo AI
Country: Global
Sector: Media/entertainment/sports/arts
Purpose: Generate art
Technology: Text-to-image; Machine learning
Issue: Privacy; Safety
Transparency: Governance