ClothOff
Page published: February 2024 | Page last updated: August 2025
ClothOff is a controversial web- and smartphone-based app that enables people to digitally "undress" women.
The app creates an illusion of nudity by using deep learning algorithms to analyse clothed photographs and generate fake nude images of the people depicted.
ClothOff - along with several other nudifier apps, including Deepsukebe - appears to be run by a team of people located primarily in Russia and Belarus, operating through a web of secretive companies.
It is promoted largely through Telegram groups and X/Twitter, and received over 9.4 million visits in the final quarter of 2024, according to Similarweb.
Deepfake pornography
Deepfake pornography, or simply fake pornography, is a type of synthetic pornography created by altering existing photographs or videos, applying deepfake technology to the images of the participants.
Source: Wikipedia
Website: ClothOff
Developer: Alaiksandr Babichau, Alexander German, Dasha Babicheva, Yevhen Bondarenko
Purpose: Nudify women
Type: Artificial intelligence
Technique: Deepfake; Deep learning; Neural network; Machine learning
ClothOff is shrouded in secrecy and is designed to evade accountability.
Ownership and management. A February 2024 Guardian investigation revealed that ClothOff is owned and run by Belarusian Dasha Babicheva and her brother Alaiksandr Babichau, both of whom use AI-generated images to conceal their identities and thereby evade accountability.
Location. ClothOff obscures the location of its operations by listing false addresses - it has, for example, claimed to be based in Buenos Aires, Argentina, and in the British Virgin Islands. A 2025 data leak revealed that its employees are located in former Soviet countries, communicate in Russian, and that the company's email service is based in Russia.
Payment masking. ClothOff uses redirect payment sites to circumvent bans from mainstream payment platforms such as PayPal, which banned the app for violating its policies. These redirects disguise payments as sales of unrelated goods, such as flowers or photography lessons, enabling the operators to continue receiving money without detection. According to an investigation by Bellingcat, the online video game marketplaces G2A and Skinsback were used to collect payments, with the sales disguised as purchases of downloadable gaming content.
Legal responsibility. ClothOff imposes terms that push legal responsibility onto users rather than onto itself. Its terms do not explicitly prohibit the creation of illegal pornographic content, only its distribution, storage, or transmission - shifting accountability to the users who upload photos.
User verification. ClothOff claims to restrict access to adults and to prevent the creation of images of minors, but it has no effective verification mechanism, and explicit content, including content involving children, is available immediately after accepting the site's terms.
Data use and storage. ClothOff states it does not store user data and disclaims responsibility for any generated images, pushing all ethical and legal burdens onto the users.
Media enquiries. ClothOff operators avoid direct contact or cooperation with journalists and investigators, using anonymous and pseudonymous representatives, who often invoke nondisclosure agreements.
Victim rights partnerships. ClothOff claims to collaborate with an organisation called ASU Label, which ostensibly supports victims of AI-related harm. However, investigations revealed a lack of transparency regarding ASU Label's governance and operations, and its relationship with ClothOff. Furthermore, ASU Label's website appears to have been generated by AI.
ClothOff is widely seen as posing significant risks and as having caused real harm to individuals, including children, and to societies across the world.
Safety. Victims of ClothOff's technology - many of whom are highly vulnerable - have reported abuse, harassment, bullying, humiliation, loss of dignity, social isolation, and other forms of psychological distress. There have also been accusations of sexual exploitation, financial blackmail and extortion, reputational damage, and other harms. Though it denies it, ClothOff has been linked to the creation and distribution of child sexual abuse material (CSAM).
Privacy. ClothOff has been criticised for significant systemic violations of privacy, including the use of its technology to undress minors without their permission or the permission of their parents or guardians. The fact that the app does not require age verification further exacerbates the issue.
October 2024. AI nudification bots swamp Telegram
August 2024. San Francisco City Attorney sues 16 nudification apps
October 2023. Westfield High School students hit by non-consensual nude deepfakes
September 2023. Almendralejo hit by AI-generated naked child images
UK Children's Commissioner (2025). “One day this could happen to me.” Children, nudification tools and sexually explicit deepfakes
Encode AI (2025). Child Deepfake Incident Map (USA)
Qiwei, L. et al. Reporting Non-Consensual Intimate Media: An Audit Study of Deepfakes
Australian Human Rights Commission (2024). The Criminal Code Amendment (Deepfake Sexual Material) Bill 2024 - Submission to the Senate Legal and Constitutional Affairs Committee
European Parliament (2023). Parliamentary question
Der Spiegel (2025). The Men Behind Deepfake Pornography
Balkan Insight (2024). ‘Undressed’ by AI: Serbian Women Defenceless Against Deepfake Porn
The Guardian (2024). Revealed: the names linked to ClothOff, the deepfake pornography app
The Guardian (2024). Black Box: the hunt for ClothOff – the deepfake porn app
Bellingcat (2024). Behind a Secretive Global Network of Non-Consensual Deepfake Pornography
Graphika (2023). A Revealing Picture