DeepNude nudification app provokes ethics, privacy controversy

Occurred: June 2019

DeepNude, an app that enabled users to generate fake nude images of women from clothed photographs within seconds, faced a strong backlash from civil and privacy rights advocates.

Launched in June 2019, DeepNude used a generative adversarial network (GAN) to remove clothing from images of women, without their consent. It was available in free and paid versions, the latter costing USD 50 per year.

The app immediately proved controversial, with civil and privacy rights advocates arguing that it was intrusive and unethical, that it objectified women, and that it could be used for revenge porn and other nefarious purposes.

The anonymous developer of the app told Vice that they were 'not a voyeur' but 'a technology enthusiast', and that they also wanted to create a male version.

Two weeks later, the open source version of the app was removed from GitHub after the developer reportedly grappled with their ethical conscience.

However, other people quickly uploaded their own versions to GitHub and other platforms. 

System

Operator: DeepNude
Developer: Anonymous/pseudonymous
Country: Estonia; Global
Sector: Media/entertainment/sports/arts
Purpose: Undress women
Technology: Deepfake - image
Issue: Privacy; Ethics/values; Bias/discrimination - gender  
Transparency: Governance; Privacy

Page info
Type: Issue
Published: March 2023
Last updated: June 2024