Audio deepfake fraudulently impersonates CEO
Occurred: July 2020
An unknown person attempted to defraud a company by using AI to impersonate its CEO, prompting concerns about the use of AI for scams and other malicious purposes.
US-based cybersecurity company NISOS uncovered an attempted fraud in which an employee received a call from someone identifying himself as the company's CEO and asking him to call back to provide 'immediate assistance to finalize an urgent business deal.'
The employee 'immediately thought it suspicious' and called the legal department.
It transpired that the voice had been faked and that the number the would-be victim was asked to call was a VoIP burner service with no user information attached.
Deepfake
Deepfakes (a portmanteau of 'deep learning' and 'fake') are images, videos, or audio clips that are edited or generated using artificial intelligence tools, and which may depict real or non-existent people. They are a type of synthetic media.
Source: Wikipedia
Unknown
Operator:
Developer:
Country: USA
Sector: Technology
Purpose: Defraud
Technology: Deepfake (audio); Machine learning
Issue: Ethics/values; Impersonation; Security
NISOS (2020). The Rise of Synthetic Audio Deepfakes
https://www.vice.com/en_us/article/pkyqvb/deepfake-audio-impersonating-ceo-fraud-attempt
https://www.theverge.com/2020/7/27/21339898/deepfake-audio-voice-clone-scam-attempt-nisos
https://techmonitor.ai/cybersecurity/growing-threat-audio-deepfake-scams
https://www.pressreader.com/malaysia/the-star-malaysia-star2/20200803/281616717705999
Page info
Type: Incident
Published: March 2023