San Francisco City Attorney sues 16 nudification apps
Occurred: August 2024
The San Francisco City Attorney filed a lawsuit against 16 websites that offer "denudification" or "deepnude" services, which use AI to remove clothing from images of women without their consent.
According to the suit, the websites were developed and deployed by an assortment of companies and individuals in Estonia, the UK, Ukraine, the USA and elsewhere, typically operating under fictitious names. Collectively, the suit says, the sites were visited over 200 million times in the first six months of 2024.
Nudification websites allow users to upload clothed images of real people, which are then processed by AI to create realistic-looking nude images - usually without the subjects' knowledge or consent. These sites are known to cause a wide range of harms, including considerable anxiety and distress for those targeted, harassment and bullying, loss of privacy, and financial loss through extortion.
The lawsuit alleges that the website operators violate US federal and California state laws against revenge pornography, deepfake pornography and child pornography, as well as California's unfair competition law, because "the harm they cause to consumers greatly outweighs any benefits associated with those practices."
San Francisco City Attorney David Chiu is seeking to shut down the websites and obtain damages for the harm caused to victims.
The legal action is thought to be the first of its kind by a government entity.
Operator: Unknown
Developer: Sol Ecom, Inc.; Briver LLC; Itai Tech Ltd.; Defirex OÜ; Itai OÜ; Augustin Gribinets
Country: USA
Sector: Media/entertainment/sports/arts
Purpose: Undress people
Technology: Deepfake - image; Generative adversarial network (GAN); Neural network; Deep learning; Machine learning
Issue: Ethics/values; Privacy; Safety
Page info
Type: Incident
Published: August 2024