Survey reveals that AI-based voice cloning service providers have insufficient anti-spoofing measures

In recent years, the development of generative AI has made it possible to imitate a person's voice from just a few seconds of audio, and services that create voice clones with AI have emerged. A survey by the US nonprofit Consumer Reports has found that many of these services lack adequate safeguards against misuse.
Consumer Reports' Assessment of AI Voice Cloning Products - Consumer Reports
https://www.consumerreports.org/media-room/press-releases/2025/03/consumer-reports-assessment-of-ai-voice-cloning-products/

AI can steal your voice, and there's not much you can do about it
https://www.nbcnews.com/tech/security/ai-voice-cloning-software-flimsy-guardrails-report-finds-rcna195131
Consumer Reports calls out poor AI voice-cloning safeguards • The Register
https://www.theregister.com/2025/03/10/ai_voice_cloning_safeguards/
Many AI voice cloning services can create artificial voice clones using only short audio clips of a particular individual speaking. These voice clones have many legitimate uses, including speeding up audio editing, aiding in movie dubbing, automating voiceovers, and helping people who have lost their voice to speak in their own voice.
However, without proper safeguards, malicious actors could use third-party audio files to create their own clones and use them to commit fraud, promote dubious products, or make derogatory comments about individuals.
There have already been several reported cases of AI voice cloning being misused. In 2023, an AI tool was used to read Adolf Hitler's book 'Mein Kampf' in the voice of Emma Watson. In 2024, a teacher was arrested after using AI to create and distribute fake audio of his high school principal making racist and anti-Semitic comments in an attempt to get him fired.
A physical education teacher who tried to get a school principal fired using an AI-synthesized voice was arrested - GIGAZINE

On March 10, 2025, Consumer Reports released the results of a survey of six companies that provide AI-based voice cloning services - Descript, ElevenLabs, Lovo, PlayHT, Resemble AI, and Speechify - examining what measures they take to prevent abuse and misuse of their services.
The investigation found that four of the six companies - ElevenLabs, Speechify, PlayHT, and Lovo - had no technical mechanism in place to prevent users from cloning someone else's voice. While these companies ask users to confirm that they have the legal right to create a voice clone, in practice a user can proceed simply by checking a box, regardless of whether they actually have that right.
Resemble AI, by contrast, requires users to record audio in real time rather than simply uploading an audio file. This could in principle prevent cloning of a third party's voice, but Consumer Reports reported that it was able to bypass the restriction by playing back audio recorded on another device.
The remaining company, Descript, requires users to record themselves reading a consent statement before creating a voice clone. This offers some protection against abuse, although it can be defeated if a clone of the target's voice has already been created on another service.
Consumer Reports also pointed out that the ease of setting up an account lowers the barrier to abuse. Four of the six companies - Speechify, Lovo, PlayHT, and Descript - allowed users to create an account with just a name and email address. By contrast, ElevenLabs and Resemble AI required credit card and payment information to set up an account.
Consumer Reports is calling on companies offering these AI-powered voice cloning services to step up measures to protect consumers from the risks of their services. Specific recommendations from Consumer Reports include:
- Verify that the person whose voice is being cloned has consented, for example by asking them to upload a recording of themselves reading a specified sentence aloud.
- Collect users' credit card or other payment information so that users who abuse cloned voices can be identified.
- Embed a watermark in AI-generated voice clones so that they can be detected.
- Provide tools that can detect whether a given piece of audio was generated by the service.
- Detect and block attempts to clone the voices of influential people such as celebrities and politicians.
- Systematically flag phrases used in scams, sexual content, and the like, and prevent them from being generated.
- Limit the voice cloning service to specific vetted users and companies, and spell out liability for misuse in the contract.
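To make the watermarking recommendation concrete, here is a toy sketch of one classic approach. This is not how any of the surveyed services actually watermark audio; it is a minimal spread-spectrum example that embeds a key-derived noise pattern at low amplitude and detects it by correlation. All function names, parameters, and thresholds below are hypothetical.

```python
import numpy as np

def embed_watermark(audio: np.ndarray, key: int, strength: float = 0.02) -> np.ndarray:
    """Embed a key-derived pseudo-random noise pattern at low amplitude."""
    rng = np.random.default_rng(key)
    pattern = rng.standard_normal(len(audio))
    return audio + strength * pattern

def detect_watermark(audio: np.ndarray, key: int, threshold: float = 0.01) -> bool:
    """Check for the watermark by correlating against the key's noise pattern."""
    rng = np.random.default_rng(key)
    pattern = rng.standard_normal(len(audio))
    # For watermarked audio this statistic is close to `strength`;
    # for unrelated audio (or the wrong key) it is close to zero.
    score = np.dot(audio, pattern) / len(audio)
    return score > threshold
```

A real watermarking scheme would also have to survive compression, resampling, and deliberate removal attempts, which this sketch makes no attempt to handle.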
'AI voice cloning tools could cause a surge in identity theft fraud,' said Grace Gedye, policy analyst at Consumer Reports. 'Our assessment shows that there are basic steps companies can take to make it harder to create a voice clone without your knowledge. But some companies aren't taking them. We call on companies to raise the bar to prevent abuse, and urge state attorneys general and the federal government to enforce existing consumer protection laws and consider whether new rules are needed.'

in Software, Web Service, Security, Posted by log1h_ik