CivitAI generates synthetic 'child pornography' images

Occurred: December 2023

Open source image generator CivitAI could be used to make images that ‘could be categorized as child pornography,’ or child sexual abuse material (CSAM), according to a media investigation.


Internal communications at OctoML, the cloud computing supplier to text-to-image platform CivitAI, shown to 404 Media revealed that OctoML was aware that some CivitAI users had been creating sexually explicit material, including nonconsensual images of real people and pornographic depictions of children. CivitAI had been using OctoML's OctoAI service for image generation.


OctoML responded to 404 Media's report by rolling out a filter to block the generation of NSFW content on CivitAI before cutting ties with the company. 'We have decided to terminate our business relationship with CivitAI. This decision aligns with our commitment to ensuring the safe and responsible use of AI,' the company told 404 Media.


CivitAI founder Justin Maier later told VentureBeat that he had been aware that some people were making NSFW content, but that he tolerated it because they were helping to train the company's models.

Databank

Operator: CivitAI
Developer: OctoML
Country: USA
Sector: Media/entertainment/sports/arts
Purpose: Generate images
Technology: Text-to-image; Generative adversarial network (GAN); Neural network; Deep learning; Machine learning
Issue: Safety
Transparency: Governance

System


News, commentary, analysis

Page info
Type: Incident
Published: December 2023