Mary Nightingale likeness used in deepfake scam
Occurred: March 2024
ITV news anchor Mary Nightingale expressed outrage after discovering that her likeness had been used in a deepfake video promoting a financial investment app.
The manipulated video, which surfaced on social media, depicted Nightingale as if she were presenting the ITV Evening News before promoting the app. Nightingale described the experience as "identity theft", warning of the potential for deepfakes to undermine public trust, particularly with elections approaching.
The incident highlights the growing prevalence of deepfake technology, which has been used to create fake celebrity endorsements and could be exploited for scams or misinformation.
Nightingale's concerns reflect broader worries about the misuse of AI technologies, prompting calls for stronger regulations to address the risks associated with deepfakes.
Deepfake
Deepfakes (a portmanteau of 'deep learning' and 'fake') are images, videos, or audio which are edited or generated using artificial intelligence tools, and which may depict real or non-existent people. They are a type of synthetic media.
Source: Wikipedia 🔗
System 🤖
Unknown
Operator:
Developer:
Country: UK
Sector: Media/entertainment/sports/arts
Purpose: Defraud
Technology: