Investigation: AI-powered pro-China influence campaign thrives on YouTube
Occurred: December 2023
A coordinated pro-China influence campaign using AI-generated content gained real traction on YouTube, attracting millions of views and hundreds of thousands of subscribers.
Researchers discovered a network of at least 30 YouTube channels, dubbed "Shadow Play," that produced over 4,500 videos promoting pro-China and anti-US narratives.
The campaign used AI-generated news anchors, voiceovers, and avatars to produce a large volume of content on topics including claims that China is winning the US-China tech war, criticisms of US companies, and predictions of US economic collapse.
The news anchors were found to have been created using Synthesia's commercially available AI avatars, specifically "Anna" and "Jason."
The videos garnered nearly 120 million views and 730,000 subscribers, apparently making it one of the most successful China-related influence operations seen on social media.
The campaign likely aims to shift English-speaking audiences' views of the roles China and the US play in international politics, the global economy, and strategic technology competition.
While the campaign has not been definitively attributed to a specific actor, researchers believe it may be operated by a Mandarin-speaking entity, possibly a commercial organisation working under some degree of direction or encouragement from the Chinese government.
Synthesia's technology is designed primarily for human resources and training videos, but its accessibility and low cost (as little as USD 30 per month) make it attractive to bad actors seeking to create deceptive content at scale.
The campaign highlights the potential for state actors to shape public opinion on a large scale, potentially impacting geopolitical relations and public trust in information sources.
The success of the campaign underscores the need for better detection and mitigation of coordinated inauthentic behaviour online, as well as greater digital literacy so that users can identify problematic content (a toy sketch of one detection signal follows below).
It also adds pressure on Synthesia to better monitor and enforce how its tools are used.
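As an illustration only, and not drawn from the ASPI research, the Python sketch below shows one crude signal that detection pipelines for coordinated inauthentic behaviour might use: flagging near-identical video titles published by different channels. All channel names, titles, and the similarity threshold are hypothetical.

```python
# Minimal sketch (hypothetical): flag near-duplicate video titles posted by
# different channels as one crude signal of possible coordination. Real
# detection pipelines combine many more signals (upload timing, shared
# avatars, audio fingerprints, network metadata, etc.).
from itertools import combinations

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy data: (channel_id, video_title) pairs. All values are illustrative.
videos = [
    ("channel_a", "China wins the chip war as US sanctions backfire"),
    ("channel_b", "US sanctions backfire: China wins the chip war"),
    ("channel_c", "How to bake sourdough bread at home"),
]

titles = [title for _, title in videos]
tfidf = TfidfVectorizer(ngram_range=(1, 2)).fit_transform(titles)
sim = cosine_similarity(tfidf)

SIMILARITY_THRESHOLD = 0.6  # arbitrary cutoff chosen for this sketch

for i, j in combinations(range(len(videos)), 2):
    cross_channel = videos[i][0] != videos[j][0]
    if cross_channel and sim[i, j] >= SIMILARITY_THRESHOLD:
        print(f"Possible coordination: {videos[i][0]} / {videos[j][0]} "
              f"(title similarity {sim[i, j]:.2f})")
```

In practice, title similarity alone produces many false positives (for example, legitimate news outlets covering the same story), so it would only ever be one weak signal among many.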
Operator:
Developer: Synthesia
Country: USA
Sector: Politics
Purpose: Manipulate public opinion
Technology: Generative AI; Machine learning
Issue: Mis/disinformation
Australian Strategic Policy Institute. Shadow Play
Page info
Type: Incident
Published: February 2025