FAKE VOICES
Artificial Intelligence lets bad guys learn someone's voice and vocal patterns, then generate a convincing fake of that voice to scam people.
As Thomas Brewster of Forbes has written: "Once a technology confined to the realm of fictional capers like Mission: Impossible, voice cloning is now widely available."
The most important thing to be aware of: if you get a phone call from someone you know who seems to be in an emergency and is pleading for money, it might be a scam using a faked voice.
These scams are still too new to have an official name. I have seen them referred to by all these terms:
- Voice fraud
- Voice phishing or the shortened version: vishing
- Voice cloning
- Voice swapping
- Artificial voice
- AI voice cloning and A.I.-generated audio
- Synthetic Audio and Deepfake Audio and Audio Deepfakes
- Deep Voice, and the generic, DeepFake
- Family-emergency schemes
IMPROVING TECHNOLOGY
April 27, 2023: It's Time to Protect Yourself From AI Voice Scams
by Caroline Mimbs Nyce in The Atlantic. Sub-head: Anyone can create a convincing clone of a stranger’s voice. What now? This is a worthwhile read; it takes a measured look at where things currently stand. Quoting: "[voice scams] have existed for some time ... but they’ve gotten better, cheaper, and more accessible in the past several months alongside a generative-AI boom. Now anyone with a dollar, a few minutes, and an Internet connection can synthesize a stranger’s voice."
A five-minute video from CNN: CNN's Donie O'Sullivan tests AI voice-mimicking software (March 2023).
April 28, 2023: I Cloned Myself With AI. She Fooled My Bank and My Family. by Joanna Stern in the Wall Street Journal. Stern replaced herself with an AI voice and video to see how humanlike the tech can be. The results were eerie. She tested Synthesia, a tool that creates artificially intelligent avatars from recorded video and audio (aka deepfakes). She also tested a voice clone generated by ElevenLabs.
The ElevenLabs voice fooled her Chase credit card’s voice biometric system. One of the systems also fooled her relatives.
February 23, 2023: How I Broke Into a Bank Account With an AI-Generated Voice by Joseph Cox for Vice.
Banks in the U.S. and Europe tout voice ID as a secure way to log into your account. Cox proved it is possible to trick such systems with free or cheap, widely available AI-generated voices. He used a free voice-creation service from ElevenLabs, an AI-voice company. TD Bank, Chase, and Wells Fargo did not respond to requests for comment; neither did ElevenLabs, despite multiple requests.
January 30, 2023: AI-Generated Voice Firm Clamps Down After 4chan Makes Celebrity Voices for Abuse by Joseph Cox for Vice. 4chan members have used AI software to generate voices that sound like Joe Rogan, Ben Shapiro, and Emma Watson to spew racist material. For example, a fake Emma Watson reads a section of Mein Kampf. ElevenLabs says it can generate a clone of someone’s voice from a clean sample recording over one minute long. The high quality of the fake voices, and the ease with which people create them, highlight the looming risk of deepfake audio clips.
January 9, 2023. It is bad enough that, when we get a phone call, the caller ID can be faked. Now, faking the voice is getting better and easier. Microsoft's new AI can simulate anyone’s voice with 3 seconds of audio by Benj Edwards for Ars Technica. Microsoft claimed that the new system can preserve the speaker's emotional tone and acoustic environment. Quoting: "Microsoft researchers announced a new text-to-speech AI model called VALL-E that can closely simulate a person's voice when given a three-second audio sample ... VALL-E can synthesize audio of that person saying anything - and do it in a way that attempts to preserve the speaker's emotional tone. Unlike other text-to-speech methods ... VALL-E generates discrete audio codec codes from text and acoustic prompts. It basically analyzes how a person sounds, breaks that information into discrete components (called 'tokens') thanks to EnCodec, and uses training data to match what it 'knows' about how that voice would sound if it spoke other phrases outside of the three-second sample."
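To make the "tokens" idea concrete, here is a minimal sketch in Python using Meta's open-source EnCodec library (pip install encodec), the neural codec mentioned above. It shows only the first step, turning a short audio clip into discrete codec tokens; the file name is a placeholder, and this is not VALL-E itself, which adds a language model trained to predict new token sequences in the cloned voice.

    import torch
    import torchaudio
    from encodec import EncodecModel
    from encodec.utils import convert_audio

    # Load the pre-trained 24 kHz EnCodec model.
    model = EncodecModel.encodec_model_24khz()
    model.set_target_bandwidth(6.0)  # 6 kbps -> 8 codebooks per frame

    # "sample_3s.wav" is a placeholder for any ~3-second clip of the speaker.
    wav, sr = torchaudio.load("sample_3s.wav")
    wav = convert_audio(wav, sr, model.sample_rate, model.channels)

    with torch.no_grad():
        frames = model.encode(wav.unsqueeze(0))  # add a batch dimension

    # Each frame is (codes, scale); codes has shape [batch, codebooks, time].
    codes = torch.cat([codebook for codebook, _ in frames], dim=-1)
    print(codes.shape)  # roughly [1, 8, 225] for 3 seconds at 75 frames/sec

Those integer tokens are the "discrete components" described above; a system like VALL-E learns to continue such sequences for arbitrary text, then decodes the tokens back into audio.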
SCAMS DONE WITH FAKE VOICES
- April 29, 2023: 'Mom, these bad men have me': She believes scammers cloned her daughter’s voice in a fake kidnapping by Faith Karimi for CNN. A mother was fooled into thinking that the voice on the phone was her daughter's. Quoting:
"Imposter scams have been around for years. Sometimes, the caller reaches out to grandparents and says their grandchild has been in an accident and needs money. Fake kidnappers have used generic recordings of people screaming. But federal officials warn such schemes are getting more sophisticated, and that some recent ones have one thing in common: cloned voices."
- March 5, 2023: They thought loved ones were calling for help. It was an AI scam. by Pranshu Verma for the Washington Post. Faking a person's voice used to require a large voice sample. No more. Bad guys can now replicate a voice from an audio sample of just a few sentences. The audio could come from YouTube, TikTok, Instagram, or Facebook videos or podcasts, making many people vulnerable. Or rather, making their relatives vulnerable to scammers. The technology is now much easier to use and cheaper, putting it in the hands of more scammers.
- October 2021: A bank manager in Hong Kong received a call from a man whose voice he thought he recognized, and the bank was scammed out of $35 million. Fraudsters Cloned Company Director's Voice In $35 Million Bank Heist, Police Find by Thomas Brewster in Forbes. Manipulating audio is easier to orchestrate than deepfake video, so expect more of this in the future.
- July 2021: A documentary about Anthony Bourdain includes three scenes with a fake voice. The director identified one of them; no one knows which the other two are. The Ethics of a Deepfake Anthony Bourdain Voice by Helen Rosner for The New Yorker.
- July 2020: Listen to This Deepfake Audio Impersonating a CEO in Brazen Fraud Attempt by Lorenzo Franceschi-Bicchierai of Vice. A security firm analyzed a suspicious voicemail left for a tech company employee, part of an attempt to get the employee to send money to criminals.
- August 2019: Fake voices help cyber-crooks steal cash (BBC, July 2019). Symantec has seen three cases of faked audio of chief executives being used to trick financial controllers into transferring cash. Separately: Fraudsters deepfake CEO’s voice to trick manager into transferring $243,000; that story was originally reported in the Wall Street Journal.
- July 2019: Deepfake Audio Used to Impersonate Senior Executives (CPO Magazine). The attacks seen so far have used background noise to mask imperfections, for example simulating someone calling from a spotty cellular phone connection or being in a busy area with a lot of traffic.
- May 19, 2017: BBC fools HSBC voice recognition security system by Dan Simmons. HSBC introduced voice-based security in 2016. Reporter Dan Simmons has a non-identical twin who was able to access Dan's account over the telephone by mimicking his brother's voice. It took the brother seven failed attempts before he got in. "Separately, a Click researcher found HSBC Voice ID kept letting them try to access their account after they deliberately failed on 20 separate occasions spread over 12 minutes."
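That last finding points at a basic missing control: locking the account after repeated failures. Below is a minimal sketch of such a lockout counter in Python. It is purely illustrative; the names, the five-attempt limit, and the window-based policy are all assumptions, not anything HSBC documented.

    import time

    # Hypothetical policy: block voice-ID attempts after MAX_FAILURES
    # failures within WINDOW_SECONDS (the BBC test saw 20+ failures
    # in 12 minutes with no lockout).
    MAX_FAILURES = 5
    WINDOW_SECONDS = 12 * 60

    failures = {}  # account id -> list of failure timestamps

    def record_failure(account):
        failures.setdefault(account, []).append(time.time())

    def is_locked(account):
        now = time.time()
        recent = [t for t in failures.get(account, []) if now - t < WINDOW_SECONDS]
        failures[account] = recent  # discard stale failures
        return len(recent) >= MAX_FAILURES

With a rule like this, the twentieth attempt would have been refused long before any mimicry succeeded.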
VOICE SCAM DEFENSES
The biggest defense is to be aware that this sort of thing exists. Calls to older adults from scammers posing as younger relatives who need emergency money are a known scam. Likewise, calls from your boss asking you to transfer money to a new bank account are suspicious.
- Be aware that the caller ID on a phone call can be spoofed.
- Be aware that a voice can be cloned using audio clips from social media posts.
- Kidnapping scammers need to know where the victim is supposed to be to make the scam believable. To counter this, the FBI says not to post information about upcoming trips on social media. Especially don't mention airplane trips, because someone on a plane cannot be called to verify that they are safe.
- When a loved one calls asking for money, put the call on hold and call them back. Or, have someone else call them back. Or call someone who can verify where the supposed victim really is.
- Verify the identity of the caller. If it is a kidnapping scam, verify the identity of the kidnapping victim.
- Ask them a question that only they would know the answer to. The problem with this is that if you believe the emergency is real, you may not be thinking clearly.
- Set up a security phrase or code word with your loved ones to verify identities on a phone call. For example, emergency requests for money or sensitive information should include the term "hockey pucks".
- "Scammers ask you to pay or send money in ways that make it hard to get your money back. If the caller says to wire money, send cryptocurrency, or buy gift cards and give them the card numbers and PINs, those could be signs of a scam." From this FTC Consumer Alert: Scammers use AI to enhance their family emergency schemes by Alvaro Puig March 20, 2023.
- Defending against audio deepfakes before it's too late (Axios April 2019). A review of the state of the art both for creating and detecting fake audio.
- How To Spot Deepfake Audio Fraud (Aug. 2019). The quality of the fake voice can be excellent for non-conversational audio, such as a statement. However, the quality suffers in back-and-forth conversation.
- May 4, 2023: Voice Cloning Makes Virtual Kidnapping More Convincing by Larry Magid. The article includes defensive advice.
CONSUMER USES OF FAKE VOICES AND AI
- January 5, 2023. Researcher Deepfakes His Voice, Uses AI to Demand Refund From Wells Fargo by Joseph Cox for Vice. Why stay on the line with the bank for 10, 20, or 30 minutes when an AI-driven bot could waste that time for you? That’s what a new tool from DoNotPay promises. The founder of DoNotPay used an AI-generated version of his own voice to overturn wire fees. DoNotPay, which has automated a ton of menial tasks, says it plans to make the tool available to customers.
- December 20, 2022. ChatGPT Can Negotiate Comcast Bills Down For You by Edward Ongweso Jr for Vice.
DoNotPay styles itself as a consumer advocate, primarily using templates to help users secure refunds from corporations. There are, however, sharp limits to the viability of that model, so the company has recently been experimenting with AI. Quoting: "Now we can really have conversations with companies and that's dramatically increased our success rate and allowed us to pursue much higher levels of disputes. Now we can negotiate hospital bills, lower utility bills, things where the companies respond and we can chat with them in real time ... That's the future of bureaucracy: bots negotiating with each other..." said Joshua Browder, CEO of DoNotPay.