Occurred: June 2019
DeepNude, an app that enabled users to digitally strip any woman of her clothes and generate a fake nude image of her within seconds, faced a strong backlash from civil and privacy rights advocates.
Launched in June 2019, DeepNude used a generative adversarial network (GAN) to remove clothing from images of women without their consent. It was available in free and paid versions, the latter costing USD 50 per year.
The app immediately proved controversial, with civil and privacy rights advocates complaining that it was intrusive, unethical, objectified women, and could be used for revenge porn and other nefarious purposes.
The app's anonymous developer told Vice, 'I'm not a voyeur, I'm a technology enthusiast', and said they also wanted to create a male version.
Two weeks later, the open source version of the app was removed from GitHub after the developer reportedly grappled with their ethical conscience.
However, other people quickly uploaded their own versions to GitHub and other platforms.
DeepNude
Operator: DeepNude
Developer: Anonymous/pseudonymous
Country: Estonia; Global
Sector: Media/entertainment/sports/arts
Purpose: Undress women
Technology: Deepfake - image
Issue: Privacy; Ethics/values; Bias/discrimination - gender
https://www.theregister.com/2019/06/27/deepfake_nudes_app_pulled/
https://www.vice.com/en/article/kzm59x/deepnude-app-creates-fake-nudes-of-any-woman
https://www.vice.com/en/article/8xzjpk/github-removed-open-source-versions-of-deepnude-app-deepfakes
https://medium.com/syncedreview/deepfake-nudie-app-goes-viral-then-shuts-down-577e8c168dfb
https://www.theguardian.com/commentisfree/2019/jun/29/deepnude-app-week-in-patriarchy-women
https://www.theregister.co.uk/2019/07/09/github_deepnude_code_discord/
Page info
Type: Issue
Published: March 2023
Last updated: June 2024