Voiceover artist accuses ScotRail of using AI version of her voice without consent
Occurred: May 2025
Page published: May 2025
Voiceover artist Gayanne Potter accused ScotRail of using an AI version of her voice for train announcements without her consent, after her recordings were allegedly sold to the rail company by a Swedish technology firm.
ScotRail unveiled a new AI-driven train announcer, nicknamed "Iona," which uses a synthetic Scottish voice developed by Swedish company ReadSpeaker.
Gayanne Potter, a professional voiceover artist whose clients include major brands, alleges that the AI voice is based on recordings she made for ReadSpeaker in 2021 under the impression they would be used for accessibility tools and e-learning.
She claims she was not informed that her voice would be used to create an AI announcer for public transport, nor did she give consent for this commercial use.
The incident caused Potter emotional distress. She said she feels "violated," "devastated," and "demeaned," and fears for her professional reputation and future work opportunities.
She argues that the AI-generated voice, which she believes is a low-quality clone, could harm her career by competing with her own professional voiceover work.
The case also raises broader concerns about data protection and the rights of creative workers, particularly as AI technology evolves and historical contracts may not explicitly cover new uses of personal data.
ReadSpeaker used Potter's recordings to train an AI voice model under a contract she signed on the understanding that the recordings would be used for accessibility and e-learning purposes.
She and her agent claim they were reassured that the recordings would not be sold to third parties, but ReadSpeaker maintains that their contract allows for the use of synthesised voices for businesses and organisations.
The incident highlights the risk of people losing control over their own voices and professional identities, potentially facing competition from AI clones created without their informed consent.
Unlike the USA, the UK has no specific legal protections for individuals' likenesses and voices, making it difficult for artists and creators to prevent AI companies from producing digital replicas without their consent.
More broadly, the case highlights the need for updated regulation on data protection, informed consent, and the commercial use of biometric data, especially as AI can replicate voices and images with increasing accuracy.
The incident also sparked public debate about the ethical use of technology and the responsibilities of companies when deploying AI systems that may impact individuals' rights and livelihoods.
Iona
Developer: ReadSpeaker
Country: UK - Scotland
Sector: Transport/logistics
Purpose: Announce trains
Technology: Machine learning; Text-to-speech
Issue: Copyright; Employment; Privacy; Transparency