Remini AI photo enhancer generates 'child porn' 

Occurred: July 2023

Photo enhancement app Remini appears to have generated an image of a naked child with a woman's face on it.

Asia Marie Williams had used a feature on the Remini app that lets users see what their future children might look like. Most others who tried the feature reported that it produced images of clothed minors, though another user shared a result showing a toddler wearing very little from the waist down.

Remini's terms of service forbid users from 'Upload[ing], generat[ing], or distribut[ing] content that facilitates the exploitation or abuse of children, including all child sexual abuse materials and any portrayal of children that could result in their sexual exploitation.'

The incident sparked accusations that Remini generates child pornography and prompted broader concerns about the app's safety.

Operator: Asia Marie Williams
Developer: Bending Spoons
Country: USA
Sector: Media/entertainment/sports/arts
Purpose: Enhance photographs
Technology: Computer vision; Generative adversarial network (GAN); Machine learning
Issue: Safety
Transparency: Complaints/appeals

Page info
Type: Incident
Published: September 2023