Michel Janse deepfake used for advert without consent
Occurred: March 2024
Michel Janse, a Christian social media influencer, had her likeness used in a YouTube advert without her consent.
The ad featured Janse's face "in her bedroom, wearing her clothes" to sell erectile dysfunction pills. Experts speculated the advertisement had been generated by an AI system trained on Janse's regular posts about travel, home decor and wedding planning.
Janse complained to YouTube, which took the advert down.
The incident highlights how AI technologies are being used to clone people's likenesses for use in digital ads. It also underscores the potential harm and violation of personal rights that can occur when personal images are used to train AI systems without consent.
System 🤖
Unknown
Operator: YouTube
Developer:
Country: USA
Sector: Media/entertainment/sports/arts
Purpose: Generate video
Technology: Deepfake - video
Issue: Personality rights; Mis/disinformation; Privacy
Transparency: Governance
News, commentary, analysis 🗞️
https://www.tiktok.com/@michel.c.janse/video/7343855927323266346?lang=en
https://www.washingtonpost.com/technology/2024/03/28/ai-women-clone-ads/
https://www.dailydot.com/news/womans-likeness-stolen-by-ai-deepfake/
https://www.unilad.com/community/ai-artificial-intelligence-steals-identity-advert-692192-20240417
https://www.distractify.com/p/company-used-womans-ai-likeness-commercial
Page info
Type: Incident
Published: April 2024