Company uses Marques Brownlee AI voice clone to promote product without consent
Occurred: October 2024
A company cloned the voice of YouTuber Marques Brownlee and used it to promote one of its products without his consent, triggering a public backlash.
Networking product company Dot used an AI-generated clone of Brownlee's voice to promote its Dot Metal product in an Instagram ad.
Brownlee publicly called out the unauthorised use of his voice on social media, describing the company's actions as "scummy" and "shady".
The increasing accessibility and sophistication of AI voice cloning technology means that, with just a 15-second sample, some AI models can now generate convincing voice clones in minutes.
With thousands of hours of their voices available online, this technology presents a major risk to high-profile individuals like Brownlee, as well as to their employers and associates.
In this instance, Dot appears to have attempted to exploit the availability of AI voice cloning technology to capitalise on Brownlee's influence and credibility.
The incident highlights the extent to which the misuse of AI technologies jeopardises individuals and companies, and raises questions about the ethics and legality of using someone's voice or other likeness without their permission.
It also underscores the need for better safeguards and policies to protect individuals from unauthorised AI-generated replicas.
Audio deepfake
An audio deepfake (also known as voice cloning or deepfake audio) is a product of artificial intelligence used to create convincing speech that sounds like specific people saying things they did not say.
Source: Wikipedia 🔗
Unknown
Operator:
Developer:
Country: USA
Sector: Media/entertainment/sports/arts
Purpose: Sell product
Technology: Deepfake - audio
Issue: Ethics/values
Page info
Type: Incident
Published: October 2024