Occurred: September 2023
A deepfake audio recording depicting Barack Obama defending himself against a conspiracy theory about the sudden death of his former chef, Tafari Campbell, was exposed as a hoax.
The recording was identified as a deepfake by misinformation monitoring company NewsGuard, which uncovered a network of TikTok accounts posting videos whose baseless claims were often supported solely by narration from AI-generated voices.
Despite TikTok’s new guidelines requiring realistic synthetic media to be labelled, the accounts bypassed these restrictions and amassed hundreds of millions of views.
NewsGuard noted that the trend of using synthetic audio to spread sensational rumours sets a precedent for bad actors seeking to manipulate public opinion and push falsehoods to mass audiences online.
Deepfake
Deepfakes (a portmanteau of 'deep learning' and 'fake') are images, videos, or audio which are edited or generated using artificial intelligence tools, and which may depict real or non-existent people. They are a type of synthetic media.
Source: Wikipedia
Operator:
Developer: ElevenLabs
Country: USA
Sector: Politics
Purpose: Damage reputation
Technology: Deepfake - audio; Machine learning
Issue: Impersonation; Mis/disinformation
NewsGuard (2023). AI Voice Technology Used to Create Conspiracy Videos on TikTok, at Scale
South Korea presidential election candidate deepfakes
Deepfake audio recording claims opposition leaders tried to rig Slovakian election
Page info
Type: Incident
Published: October 2023