Fake AI videos amplify Myanmar earthquake disinformation
Occurred: March 2025
A series of AI-generated videos falsely depicting catastrophic destruction from the March 2025 earthquake in Myanmar circulated widely on social media, causing panic and confusion.
Hyper-realistic AI-fabricated footage exploiting the magnitude 7.7 earthquake, which killed over 2,700 people in Myanmar, showed collapsed bridges, massive craters, and ruined urban areas, and falsely claimed that 17 million people were affected.
Social media algorithms amplified the content, which was rapidly shared by journalists and influencers before it could be verified as fake.
Watermarks indicate the videos were likely created using tools such as Runway AI, which often have few guardrails. Motivations ranged from chasing social media engagement to intentionally sowing fear or undermining trust in authorities. Limited access to verified information from Myanmar, a result of military media controls, created fertile ground for disinformation.
The incident highlights the escalating threat of AI-generated misinformation and disinformation during crises, where emotions often override objectivity.
It also underscores the urgent need for improved detection tools, platform accountability for content verification, and public media literacy programmes to counter the risks of AI-powered fake content.
Deepfake
Deepfakes (a portmanteau of 'deep learning' and 'fake'[1]) are images, videos, or audio that have been edited or generated using artificial intelligence, AI-based tools, or AV editing software.
Source: Wikipedia
MINIMAX Hailuo AI
Runway
Operator:
Developer: MiniMax; Runway
Country: Myanmar
Sector: Multiple
Purpose: Scare/confuse/destabilise
Technology: Deepfake; Machine learning
Issue: Mis/disinformation
Page info
Type: Incident
Published: April 2025