AI nudification bots swamp Telegram

Occurred: October 2024

Telegram has seen a surge in the availability and use of AI-powered bots that create nude images of individuals, including minors, according to multiple investigations.

What happened

An October 2024 WIRED investigation discovered over 50 bots generating non-consensual fake nude images of women from ordinary photos; together, the bots drew an estimated 4 million monthly users.

An earlier Balkan Insight (BIRN) investigation reported that tens of thousands of Telegram users in Serbia were sharing images of women ‘undressed’ by artificial intelligence. One such channel - for the ClothOff nudification app - had over 535,000 subscribers.

A February 2025 investigation by Nucleo found 23 Telegram bots actively creating AI-generated child sexual abuse material (CSAM), challenging the company's promises to crack down on such criminal content.

Victims often experience severe emotional distress, humiliation, fear and long-term psychological trauma due to the non-consensual nature of the images. In some cases, the images are used to extort money from them.

Why it happened

Telegram's platform hosts a multitude of channels and bots that facilitate the creation and sharing of these images, with users incentivised through gamification elements that reward them for creating and sharing deepfake images.

Telegram reportedly does little or nothing to stop these channels or their use.

Meanwhile, many legal jurisdictions lack specific regulations addressing AI-altered images, leaving victims exposed to exploitation without legal recourse.

What it means

The findings raise significant concerns about the privacy and psychological wellbeing of victims, particularly women and girls.

The situation underscores the urgent need for up-to-date legislation that addresses the unique challenges posed by AI-generated content.

While some initiatives have been proposed, such as the US DEEPFAKES Accountability Act, most have not been signed into law, and questions remain about the effectiveness of those that have.

Deepfake pornography

Deepfake pornography, or simply fake pornography, is a type of synthetic pornography created by altering existing photographs or video, applying deepfake technology to the images of the participants.

Source: Wikipedia 🔗

System 🤖

Operator:
Developer:  
Country: Croatia; Kosovo; Montenegro; Serbia; USA
Sector: Media/entertainment/sports/arts
Purpose: Nudify women
Technology: Deepfake; Machine learning
Issue: Privacy; Safety