San Francisco City Attorney sues 16 nudification websites
Occurred: August 2024
The San Francisco City Attorney filed a lawsuit against 16 websites offering "nudification" or "deepnude" services, which use AI to remove clothing from images of women without their consent.
According to the suit, the websites were developed and deployed by an assortment of companies and individuals in Estonia, the UK, Ukraine, the USA and elsewhere, typically operating under fictitious names. The suit says the sites were collectively visited more than 200 million times in the first six months of 2024.
Nudification websites allow users to upload clothed images of real people, which are then processed by AI to create realistic-looking nude images, usually without the subjects' knowledge or consent. These sites are known to result in a wide range of harms, including considerable anxiety and distress for those targeted, harassment and bullying, and loss of privacy.