Occurred: January 2020
Deepfake technology was used to create a fake audio recording of a lawyer's client during a child custody battle heard in a UK court.
Dubai-based family lawyer Byron James revealed that a 'heavily doctored' recording of his client appearing to utter 'violent' threats towards his wife had been presented in court, threats James said his client had never made.
Experts examining the deepfake's metadata concluded the recording had been manipulated. Had the evidence gone unchallenged, it would have damaged the client's case by portraying him as a violent and aggressive man.
This incident raised concerns about the reliability of legal evidence in the age of AI. It also pointed to the need to train judges and litigators to identify manipulated evidence.
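The entry does not say which tools the experts used to inspect the recording's metadata. As a hedged illustration only, the sketch below shows one generic first-pass check: pulling container- and stream-level metadata from an audio file with FFmpeg's ffprobe utility, where mismatched encoder tags or creation times can hint at re-encoding or editing. The file name recording.m4a and the helper probe_audio_metadata are hypothetical.

```python
# Illustrative sketch only: not the method used in this case.
# Requires the ffprobe command-line tool (part of FFmpeg) on PATH.
import json
import subprocess


def probe_audio_metadata(path: str) -> dict:
    """Return ffprobe's format- and stream-level metadata for an audio file."""
    result = subprocess.run(
        [
            "ffprobe",
            "-v", "quiet",
            "-print_format", "json",
            "-show_format",
            "-show_streams",
            path,
        ],
        capture_output=True,
        text=True,
        check=True,
    )
    return json.loads(result.stdout)


if __name__ == "__main__":
    info = probe_audio_metadata("recording.m4a")  # hypothetical file name
    fmt = info.get("format", {})
    # Tags such as 'encoder' or 'creation_time' that do not match the claimed
    # recording device or date are one possible sign of manipulation.
    print("Container:", fmt.get("format_name"))
    print("Duration (s):", fmt.get("duration"))
    print("Tags:", fmt.get("tags", {}))
```

Metadata checks of this kind are only a starting point; forensic analysts typically combine them with waveform and spectral analysis before drawing conclusions.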
Deepfake
Deepfakes (a portmanteau of 'deep learning' and 'fake') are images, videos, or audio which are edited or generated using artificial intelligence tools, and which may depict real or non-existent people. They are a type of synthetic media.
Source: Wikipedia 🔗
Operator: Unknown
Developer: Unknown
Country: UAE/Dubai; UK
Sector: Govt - justice
Purpose: Damage reputation
Technology: Deepfake - audio; Machine learning
Issue: Mis/disinformation; Ethics/values
Transparency: Governance; Marketing
Page info
Type: Incident
Published: December 2021