AI Voices Are Built on Stolen Data

Your voice is not free data. AI voice tech is cloning voices without permission. VoiceProductions stands up for voice actors and ethical usage.
Many AI voice-over platforms are built on data they never owned. Not just the voices of public figures, but also of everyday people. Content scraped from podcasts, films, audiobooks, online tutorials. Often unattributed. Frequently unpaid.
Even nonprofit initiatives haven’t been spared.
The Misuse of LibriVox Voices
Take LibriVox, a nonprofit project launched in 2005. Volunteers record public domain books, often for use by the blind or visually impaired. Its mission is access. Inclusion. Community.
Today, the archive hosts over 30,000 audiobooks. That’s tens of thousands of hours of recorded human voice. All donated freely, without expectation of payment. And crucially, without anticipation of AI replication.
Yet these recordings are being ingested without permission. Used to train commercial voice models. Not for accessibility. But for monetization.
The Ethical Grey Zone
AI companies tap into public domain sources. They don’t notify the creators. They don’t sign agreements. They don’t offer compensation.
Most of these recordings predate modern AI. Contracts don’t account for machine learning. Volunteers never imagined their work would be reprocessed into commercial voice clones.
Legally? There’s ambiguity. Ethically? There is none.
A voice is not generic data. It is unique. It is expressive. It is, in many cases, protected.
Public domain should not be a loophole for private gain.
A Legal Storm Is Coming
The first lawsuits have already started. But this is only the beginning.
Lawyers are sounding the alarm. They call it a ticking time bomb: thousands of hours of voice recordings, donated in good faith, now silently absorbed into commercial AI datasets. Not for education. Not for accessibility. But for profit. For voice tech designed to replace the very people who built it.
Once these practices become visible, the backlash will be fierce.
Voice actors and unions are preparing. And the central questions are razor-sharp:
- Can you truly give consent to a use of your voice when AI didn’t even exist at the time?
- Does an AI company have the right to capture and own your voiceprint?
This is no longer a semantic debate. This is legal territory.
Because public domain means open access, not open ownership. Especially not for tech companies that process that data behind closed doors. These are one-way systems. They don’t credit. They don’t pay. They don’t disclose.
What to expect?
- Major lawsuits, led by voice actors
- New laws around data usage, consent, and repurposing
- Landmark rulings that will reshape how AI companies operate
The wave is coming. And those profiting from grey areas today will soon have to answer for it.
What happens when AI steals your voice?
Can AI use your voice without permission?
That’s the question voice actor Gayanne Potter was forced to ask. Her voice was suddenly heard on ScotRail trains through a new AI announcement system called “Iona”, and she had never signed off on that.
The voice had been cloned, trained, and deployed. All without her consent.
From E-learning to Commercial AI
Back in 2021, Potter recorded voiceovers for ReadSpeaker. The agreement was clear: the recordings were strictly for e-learning and accessibility purposes. Not for commercial ventures. Not for mass public infrastructure. And certainly not for AI systems heard across a national railway.
Yet her voice showed up on trains. Repurposed. Repackaged. Renamed. As “Iona,” an AI-generated voice trained on her original recordings. She had no idea this was happening. In an interview with the BBC, Potter said she felt shocked, betrayed, and violated.
Her voice wasn’t just used. It was reengineered. The synthetic copy now competes directly with her own career. It speaks without her knowledge, in her voice, in her own country.
Legal Battle Incoming?
Together with lawyer Mathilde Pavis, Potter is preparing legal action. The core issue is simple: even if AI wasn't explicitly named in her contract, this kind of use was never agreed to. And legally, it doesn’t hold up. Pavis puts it clearly:
“The remuneration needs to be proportionate to the value of the commercial transaction.”
That hasn’t happened. Not even close. Potter’s team argues that her voice qualifies as biometric data. Data that cannot legally be processed or monetized without explicit, informed consent.
Political Attention in Scotland
The British performers' union Equity called it “pure exploitation.” In May 2025, the issue was formally raised in the Scottish Parliament. First Minister John Swinney acknowledged the seriousness of the matter. ScotRail stated that the conflict lies between Potter and ReadSpeaker. Meanwhile, the AI voice system continues running.
ScotRail Distances Itself
ScotRail claims it isn’t responsible. The company insists the matter falls solely on ReadSpeaker. As long as there’s no legal block, the AI voice remains in operation. That only increases frustration among Potter’s supporters. As long as “Iona” speaks, her voice remains co-opted, without approval.
What Happens Next?
Potter is collecting evidence. She has filed a formal complaint with the UK’s Information Commissioner’s Office (ICO). Alongside her legal action, she’s also becoming a voice for change. Advocating for tighter regulation. Fighting for consent and clarity. Whether you're a Hollywood icon or a volunteer narrator, your voice should not be taken.
Stop the Unauthorised Use of AI Voices
Globally, voice actors are raising the alarm. Their voices are being cloned, repurposed, and monetized. No agreement. No notice. No compensation.
One of the most visible cases? Scarlett Johansson.
In September 2023, she declined OpenAI’s invitation to lend her voice to ChatGPT. In May 2024, the company launched “Sky”, an AI voice that sounded eerily like her character in the film Her.
Johansson was stunned. “I was shocked, angered and in disbelief,” she later said. OpenAI CEO Sam Altman had approached her twice, the second time just days before the public release of GPT-4o.
She said no. But the world heard something very close to yes.
To make matters worse, Altman tweeted a single word after the launch: “her.”
It wasn’t subtle. It wasn’t innocent. It was a nod, a message, a trigger. And likely, a legal mistake.
She wasn’t alone.
In the U.S., voice actors have filed a lawsuit against Lovo.ai. Their claim? The company used their voices without consent to train AI. In China, a company was already convicted in a similar case. The message is clear: this is no longer an isolated issue.
This Isn’t Just About Voice
The unauthorised use of voice is part of a wider pattern. Across the entire creative industry, AI companies are scraping, cloning and repurposing work without consent.
Writers, for example, have reported AI tools mimicking their tone, structure, and ideas. Visual artists have sued over models trained on copyrighted artwork. Even Disney is pushing back against Midjourney for replicating its distinct visual style.
News organisations like the BBC and the New York Post have also raised legal concerns after AI companies lifted their content for training purposes.
All these cases are linked, and the solution should be simple. Public access does not mean public ownership.
But how do we fight for that right?
Creators of all kinds are seeing their work exploited in systems that offer no credit, no payment and no control. Where does it stop?
VoiceProductions Responds
At VoiceProductions, hundreds of professional voice actors work on a wide variety of productions every day. From commercials and e-learning to documentaries and audiobooks. Their voices are more than just sound. They reflect training, timing, and expression. A voice represents its creator and, by extension, that creator’s rights.
And yet, we increasingly see voices being cloned using AI, without the speaker’s permission. This practice is not only legally risky, it goes against fundamental principles of ownership and control.
That’s why our platform is built on clear and explicit agreements. Every recording is delivered with documented consent and detailed usage terms. Clients know exactly what they are getting, and voice actors retain control over their work.
Technology can do a lot, but it can’t do everything. Where a human voice persuades with nuance, pace, and intent, an AI imitation remains exactly that: an imitation. Even if that gap were ever to close, one thing remains non-negotiable: consent. Not as a box to tick, but as the foundation of ethical voice production.
VoiceProductions stands for transparency and respect. For clients, and for creators. The future of voice technology is full of potential, but only if it’s built on solid agreements and genuine respect for the people behind the voice.
And they matter. Always.
Jimmy Verrijt - Founder and Project Manager, VoiceProductions
LinkedIn
Loïc Thaler - Project Manager, VoiceProductions
LinkedIn