Moscow AI voice campaign attempts to undermine European support for Ukraine
Occurred: December 2024
A Russia-linked campaign designed to undermine Europe’s support for Ukraine used AI-generated voiceovers on fake and misleading “news” videos.
Dubbed "Operation Undercut," the campaign aimed to sow distrust among European nations and involved the use of AI-generated voices to create fake news videos that sought to portray Ukrainian leaders as corrupt and suggesting that advanced military equipment, like US-manufactured Abrams tanks, would be ineffective in the conflict.
US-based start-up ElevenLabs' AI voice generation technology was used to produce the voiceovers in multiple languages, including English, French, German, and Polish, according to cybersecurity firm Recorded Future.
Other AI voice products may also have been used, the company reports.
By using AI-generated voices that sounded native and free of foreign accents, the campaign sought to make its misleading content appear more credible. The tactic allowed Russian operatives to disseminate their messages quickly and effectively in multiple European languages.
The use of ElevenLabs' technology in the campaign highlights ongoing concerns about the apparent ease with which it can be used for malicious purposes, including misinformation and disinformation.
More broadly, while the campaign's overall impact on public opinion is thought to have been minimal, the incident underscores the growing risks associated with generative AI technologies.
Operator: Government of Russia; Social Design Agency
Developer: ElevenLabs
Country: France; Germany; Poland; Turkey; UK
Sector: Politics
Purpose: Manipulate public opinion
Technology: Text-to-speech; Deep learning; Machine learning
Issue: Mis/disinformation
Recorded Future. "Operation Undercut" Shows Multifaceted Nature of SDA's Influence Operations (pdf)
Page info
Type: Issue
Published: December 2024