Protecting against voice capture

Thinking about spam texts, I started wondering about attackers capturing voices from outgoing voicemail recordings.

From Wired:

“One misunderstanding is, ‘It cannot happen to me. No one can clone my voice,’” says Rahul Sood, chief product officer at Pindrop, a security company that discovered the likely origins of the AI Biden audio. “What people don’t realize is that with as little as five to 10 seconds of your voice, on a TikTok you might have created or a YouTube video from your professional life, that content can be easily used to create your clone.” Using AI tools, the outgoing voicemail message on your smartphone might even be enough to replicate your voice.

Short of switching to an automated outgoing voicemail message (not always possible for businesses that need tailored greetings), is there anything else we can do?


Use a generative AI to create audio based on somebody else’s voice (don’t use Scarlett Johansson!).
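
If you want to try something along those lines without cloning anyone in particular, here is a minimal sketch that generates an outgoing greeting with a stock synthetic voice, using the pyttsx3 offline TTS library. The library choice, greeting text, and output filename are all my own assumptions, not anything from the article; any TTS tool would do.

```python
# Sketch: generate an outgoing voicemail greeting with a stock synthetic
# voice, so your own voice never appears in the published recording.
# Assumes the pyttsx3 offline TTS library (pip install pyttsx3); the
# greeting text and output filename are placeholders.
import pyttsx3

engine = pyttsx3.init()

# Pick any installed system voice -- the point is simply that it isn't yours.
voices = engine.getProperty("voices")
if voices:
    engine.setProperty("voice", voices[0].id)
engine.setProperty("rate", 150)  # slow the speech down slightly for clarity

greeting = (
    "You have reached our office. Please leave your name, number, "
    "and a brief message, and we will return your call."
)

# Write the greeting to an audio file you can upload as your outgoing message.
engine.save_to_file(greeting, "greeting.wav")
engine.runAndWait()
```

The specific tool doesn’t matter much; what matters is that the greeting callers hear contains no voice an attacker could map back to you.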