Popular Voice Cloning Tools Lack Safeguards


Abuse Can Lead to Fraud, Impersonation Scams Rashmi Ramesh (rashmiramesh_) • March 11, 2025

Need a new voice? Artificial intelligence has you covered. Need to protect your own? That's another story. Some of the most widely used AI voice synthesis tools offer only superficial safeguards against misuse, if any at all, researchers found in a recent analysis.

Popular AI-powered voice cloning tools lack sufficient safeguards to prevent misuse, a Consumer Reports study found, raising concerns over the potential for fraud and impersonation scams. The study assessed voice cloning products from six companies: Descript, ElevenLabs, Lovo, PlayHT, Resemble AI and Speechify, and found that four out of the six providers offer minimal or no meaningful security measures against abuse.

ElevenLabs, Speechify, PlayHT and Lovo require only basic user self-attestation, such as checking a box confirming that the user has the legal right ...


Copyright of this story solely belongs to bankinfosecurity.