Company uses Marques Brownlee AI voice clone to promote product without consent
Occurred: October 2024
Page published: October 2024
A networking company used an AI-cloned replica of tech influencer Marques Brownlee’s voice to promote their product without his consent, highlighting the growing threat of "voicejacking" to professional reputations and the legal grey area surrounding AI-generated likenesses.
Networking product company Dot used an AI-generated clone of Brownlee's voice to promote their Dot Metal product in an Instagram ad.
Brownlee publicly called out the unauthorised use of his voice on social media, describing the company's actions as "scummy" and "shady".
The increasing accessibility and sophistication of AI voice cloning technology means that, with just a 15-second sample, some AI models can now generate convincing voice clones in minutes.
With thousands of hours of their voices available online, this technology poses a major risk to high-profile individuals like Brownlee, as well as to their employers and business partners.
In this instance, Dot appears to have attempted to exploit the availability of AI voice cloning technology to capitalise on Brownlee's influence and credibility.
The incident demonstrates the extent to which the misuse of AI technologies jeopardises individuals and companies, and raises questions about the ethics and legality of using someone's voice or other likeness without their permission.
It also underlines the need for legislation, such as the NO FAKES Act or similar "Right to Publicity" expansions, to specifically protect digital likenesses and voices from unauthorised commercial exploitation.
Developer: Unknown
Country: USA
Sector: Media/entertainment/sports/arts
Purpose: Sell product
Technology: Deepfake
Issue: Authenticity/integrity; Consent
AIAAIC Repository ID: AIAAIC1784