Georgia-based group's deepfake celebrity ads push fake crypto schemes
Occurred: March 2025
A Georgia-based criminal network scammed over 6,000 victims globally out of USD 35 million (GBP 27 million) using deepfake videos and fabricated news articles featuring celebrities to promote fraudulent cryptocurrency and investment schemes.
Operating from call centers in Tbilisi, Georgia, the group created fake ads on Facebook and Google featuring manipulated endorsements from UK celebrities Martin Lewis, Zoe Ball, and Ben Fogle.
Victims were lured into transferring savings to sham investment platforms, with UK citizens losing approximately GBP 9 million.
Leaked data shared with the Guardian and other news publishers by Swedish television channel SVT and the Organized Crime and Corruption Reporting Project (OCCRP) revealed that the scammers used highly aggressive tactics, including impersonating authorities to extract additional fees from victims trying to recover funds.
Some victims reported losing their life savings and having suicidal thoughts. Celebrities faced reputational damage and personal distress, with Ben Fogle noting his mother was targeted.
The fraud exploited gaps in social media platforms’ ad verification systems, despite policies prohibiting deceptive content.
Affiliate marketers were paid to distribute fake articles and deepfakes on Meta and Google platforms, while the Georgian operatives benefited from lax local enforcement.
Regulatory delays to the UK’s Online Safety Act, whose provisions on scam ads do not take effect until 2026, further enabled the schemes.
The case raises questions about global enforcement against cross-border cybercrime networks and the ethical responsibilities of platforms profiting from ad revenue.
It also highlights the need for technology companies to proactively detect and remove scam content, rather than relying on post-hoc reporting.
Deepfake
Deepfakes (a portmanteau of 'deep learning' and 'fake') are images, videos, or audio that have been edited or generated using artificial intelligence-based tools or audio-visual editing software.
Source: Wikipedia 🔗
Operator:
Developer:
Country: Australia; Bulgaria; Canada; Cyprus; Ireland; South Africa; Spain; UK; Ukraine
Sector: Banking/financial services
Purpose: Defraud
Technology: Deepfake; Generative AI; Machine learning
Issue: Fraud
Page info
Type: Incident
Published: April 2025