ClothOff - AI nudifier
ClothOff is a controversial web- and smartphone-based app that invites users to 'undress anyone using AI' for approximately GBP 8.50 (USD 11.30) per 25 credits.
The app uses deep learning algorithms to analyse clothed photographs and generate nude representations of their subjects, creating the illusion of nudity.
ClothOff appears to be run out of Belarus and has been promoted largely via Telegram groups, operating through a series of secretive companies in Delaware, Russia, New Zealand, the UK and elsewhere.
The app received over 9.4 million visitors in the final quarter of 2024, according to Similarweb.
Deepfake pornography
Deepfake pornography, or simply fake pornography, is a type of synthetic pornography that is created via altering already-existing photographs or video by applying deepfake technology to the images of the participants.
Source: Wikipedia 🔗
Website: ClothOff 🔗
Status: Active
Released:
Developer: Alaiksandr Babichau, Alexander German, Dasha Babicheva, Yevhen Bondarenko
Purpose: Nudify women
Type: Nudifier
Technique: Deepfake; Machine learning
Like other nudifier services, ClothOff is shrouded in secrecy and has been subject to media and police investigations into its shady ownership, management and impacts.
Governance. A February 2024 Guardian investigation revealed that ClothOff is run by Belarusian Dasha Babicheva and her brother Alaiksandr Babichau, both of whom use AI-generated images to conceal their identities.
Supply chain. Online video game marketplaces G2A and Skinsback were being used to collect payments for ClothOff and a number of similar platforms, with the sales disguised as purchases of downloadable gaming content, according to an investigation by Bellingcat.
Partnerships. ClothOff claims to collaborate with an organisation called ASU Label, which ostensibly supports victims of AI-related harm. However, investigations revealed a lack of transparency regarding ASU Label's governance and operations, and its relationship with ClothOff. Furthermore, its website appears to have been generated by AI.
Privacy. ClothOff has been criticised for significant violations of privacy, including its use by minors to undress other minors. The app does not require age verification, further exacerbating the issue.
Safety. Victims of ClothOff have reported harassment, humiliation, severe psychological harm, and reputational damage. The app has also been linked to the creation and distribution of child sexual abuse material (CSAM).
Ethics. The app's terms and conditions do not mention consent, nor do they classify the creation of illegal pornographic material as a violation, raising serious ethical and moral questions about its owners and governance.
October 2024. AI nudification bots swamp Telegram
October 2023. Westfield High School students hit by non-consensual nude deepfakes
September 2023. Almendralejo hit by AI-generated naked child images
European Parliament. Parliamentary question
The Tech Transparency Project. Nonconsensual Sexual Deepfake App Verified by X
Qiwei L. et al. Reporting Non-Consensual Intimate Media: An Audit Study of Deepfakes
Balkan Insight (2024). 'Undressed' by AI: Serbian Women Defenceless Against Deepfake Porn
The Guardian (2024). Revealed: the names linked to ClothOff, the deepfake pornography app
The Guardian (2024). Black Box: the hunt for ClothOff – the deepfake porn app
Bellingcat (2024). Behind a Secretive Global Network of Non-Consensual Deepfake Pornography
Graphika (2023). A Revealing Picture
Page info
Type: System
Published: February 2024
Last updated: April 2025