FAKE VOICES
Artificial Intelligence allows bad guys to learn someone's voice and vocal patterns, then generate a convincing copy of that voice to scam people.
As Thomas Brewster wrote: "Once a technology confined to the realm of fictional capers like Mission: Impossible, voice cloning is now widely available."
The important thing to be aware of: if you get a phone call from someone you know who seems to be in an emergency and is pleading for money, it might be a scam using a faked voice.
These scams do not yet have an official name. I have seen them referred to by all of these terms:
- Voice fraud
- Voice phishing or the shortened version: vishing
- Voice cloning and Vocal Cloning
- Voice swapping
- Artificial voice
- AI voice cloning and AI-generated audio
- Synthetic Audio and Deepfake Audio and Audio Deepfakes
- Deep Voice, and the generic, DeepFake
- Family-emergency schemes
- Impersonation scams
FYI
- December 6, 2024: Your AI clone could target your family, but there's a simple defense by Benj Edwards for Ars Technica. Quoting: "On Tuesday, the US Federal Bureau of Investigation advised Americans to share a secret word or phrase with their family members to protect against AI-powered voice-cloning scams, as criminals increasingly use voice synthesis to impersonate loved ones in crisis."
A good overview article.
- What To Do if You Were Scammed from the Federal Trade Commission July 2022.
IMPROVING TECHNOLOGY
- December 3, 2024: Criminals Use Generative Artificial Intelligence to Facilitate Financial Fraud from the FBI.
Alert Number: I-120324-PSA. Quoting: "Generative AI reduces the time and effort criminals must expend to deceive their targets. Generative AI takes what it has learned from examples input by a user and synthesizes something entirely new based on that information. These tools assist with content creation and can correct for human errors that might otherwise serve as warning signs of fraud." This has an overview of many different approaches used by bad guys in scams, and some defensive steps.
- April 27, 2023: It's Time to Protect Yourself From AI Voice Scams by Caroline Mimbs Nyce in The Atlantic. Sub-head: Anyone can create a convincing clone of a stranger's voice. What now? This is a worthwhile read; it takes a measured look at where things currently stand. Quoting: "[voice scams] have existed for some time ... but they've gotten better, cheaper, and more accessible in the past several months alongside a generative-AI boom. Now anyone with a dollar, a few minutes, and an Internet connection can synthesize a stranger's voice."
- A five-minute video from CNN: CNN's Donie O'Sullivan tests AI voice-mimicking software. March 2023.
- April 28, 2023: I Cloned Myself With AI. She Fooled My Bank and My Family. by Joanna Stern in the Wall Street Journal. Stern replaced herself with an AI voice and video to see how humanlike the tech can be. The results were eerie. She tested Synthesia, a tool that creates artificially intelligent avatars from recorded video and audio (aka deepfakes). She also tested a voice clone generated by ElevenLabs. The ElevenLabs voice fooled her Chase credit card's voice biometric system. One of the systems also fooled her relatives.
- February 23, 2023: How I Broke Into a Bank Account With an AI-Generated Voice by Joseph Cox for Vice. Banks in the U.S. and Europe tout voice ID as a secure way to log into your account. Cox proved it is possible to trick such systems with free or cheap AI-generated voices that are widely available. Cox used a free voice creation service from ElevenLabs, an AI-voice company. TD Bank, Chase and Wells Fargo did not respond to a request for comment. Likewise, ElevenLabs did not respond to multiple requests for comment.
- January 30, 2023: AI-Generated Voice Firm Clamps Down After 4chan Makes Celebrity Voices for Abuse by Joseph Cox for Vice. 4chan members have used AI software to generate voices that sound like Joe Rogan, Ben Shapiro, and Emma Watson to spew racist material. For example, a fake Emma Watson reads a section of Mein Kampf. ElevenLabs says it can generate a clone of someone's voice from a clean sample recording, over one minute long. The high quality of the fake voices, and the ease with which people create them, highlight the looming risk of deepfake audio clips.
- January 9, 2023. It is bad enough that, when we get a phone call, the caller ID can be faked. Now, faking the voice is getting better and easier. Microsoft's new AI can simulate anyone's voice with 3 seconds of audio by Benj Edwards for Ars Technica. Microsoft claims the new system can preserve the speaker's emotional tone and acoustic environment. Quoting: "Microsoft researchers announced a new text-to-speech AI model called VALL-E that can closely simulate a person's voice when given a three-second audio sample ... VALL-E can synthesize audio of that person saying anything - and do it in a way that attempts to preserve the speaker's emotional tone. Unlike other text-to-speech methods ... VALL-E generates discrete audio codec codes from text and acoustic prompts. It basically analyzes how a person sounds, breaks that information into discrete components (called 'tokens') thanks to EnCodec, and uses training data to match what it 'knows' about how that voice would sound if it spoke other phrases outside of the three-second sample." A sketch of this audio-tokenization step appears below.
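To make the 'tokens' idea concrete, here is a minimal sketch in Python of turning a short audio clip into discrete codec codes and back, using Meta's open-source EnCodec library (the neural codec the quote above mentions). The file name sample.wav is a placeholder of my choosing, and this shows only the tokenization step, not voice cloning itself.

    import torch
    import torchaudio
    from encodec import EncodecModel
    from encodec.utils import convert_audio

    # Load the pre-trained 24 kHz EnCodec model. A lower target bandwidth
    # means fewer discrete codes per second of audio.
    model = EncodecModel.encodec_model_24khz()
    model.set_target_bandwidth(6.0)

    # Load a short clip (placeholder file name) and convert it to the
    # sample rate and channel count the model expects.
    wav, sr = torchaudio.load("sample.wav")
    wav = convert_audio(wav, sr, model.sample_rate, model.channels)
    wav = wav.unsqueeze(0)  # add a batch dimension

    # Encode: the waveform becomes a small grid of integer codes
    # ("tokens"): one row per codebook, one column per time step.
    with torch.no_grad():
        encoded_frames = model.encode(wav)
    codes = torch.cat([frame[0] for frame in encoded_frames], dim=-1)
    print(codes.shape)  # [batch, codebooks, time steps]

    # Decode: the same discrete codes reconstruct a close copy of the audio.
    with torch.no_grad():
        reconstructed = model.decode(encoded_frames)

A model like VALL-E learns to predict grids of codes like these from text plus a short voice sample; it generates the discrete codes, and a codec decoder like the one above turns them back into audio.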
SCAMS DONE WITH FAKE VOICES
- July 27, 2024: Ferrari exec foils deepfake attempt by asking the scammer a question only CEO Benedetto Vigna could answer by Daniele Lepido and Bloomberg for Fortune. A Ferrari executive first got a bunch of unexpected messages, seemingly from the CEO. Then came a scam phone call. The voice impersonating the CEO was convincing, but the exec posed a question that only the real CEO could answer. End of phone call. Quoting: "While these generative AI tools can create convincing deepfake images, videos and recordings, they've not yet proved convincing enough to cause the widespread deception that many have warned about."
- July 5, 2024: Fake voice scams have gone Hollywood - they made a movie about it. See: The voice scam call portrayed in 'Thelma' is real and an increasing threat in the age of AI by Genna Contino for CNBC. The focus here is on senior citizens, who may need extra protection from scams. For that, experts recommend freezing their credit and establishing a financial surrogate.
- April 12, 2024: Scammers Target LastPass Employee With CEO Audio Deepfake by Michael Kan for PC Magazine. Password manager LastPass is warning the public to be on guard against AI-generated scam calls pretending to be your boss, after one such deepfake audio call targeted a company employee. The scam failed. It used a WhatsApp audio call and impersonated LastPass CEO Karim Toubba.
- March 7, 2024: From The New Yorker Magazine: The Terrifying A.I. Scam That Uses Your Loved One's Voice by Charles Bethea. A long article that describes multiple instances of AI-generated fake voices of relatives being used for scams. On the tech side, it notes that a big breakthrough came in 2022 when: "... ElevenLabs unveiled a service that produced impressive clones of virtually any voice quickly; breathing sounds had been incorporated, and more than two dozen languages could be cloned. ElevenLabs's technology is now widely available. 'You can just navigate to an app, pay five dollars a month, feed it forty-five seconds of someone's voice, and then clone that voice,' ... The company is now valued at more than a billion dollars, and the rest of Big Tech is chasing closely behind." The Microsoft VALL-E cloning program needs just a 3-second sample to replicate a voice; it is not available to the public. AI-generated voices are good enough to eliminate subtitles and dubbing in movies, allowing actors to appear to speak other languages. "In January, voters in New Hampshire received a robocall from Joe Biden's voice telling them not to vote in the primary." As for defense, there is not much here, just that wiring money, sending cryptocurrency, and buying gift cards are common indicators of a scam. One example in the article used Venmo to pay the bad guys. One family, after being victimized, chose a family password, but someone forgot it.
- January 25, 2024: Preparing For The Age Of AI Scams from NPR. A 33-minute audio recording of a radio show.
- December 6, 2023: A father is warning others about a new AI family emergency scam by Andrea Blanco for the Independent. Nothing much new here. A couple of parents testified before Congress about being scammed by fake voices of their children. One scam happened in 2020. Creating a fake voice requires only a very short sample, one that might come from YouTube or social media. In the worst case, a 3-second video sample can be used to create a fake voice.
- April 29, 2023: 'Mom, these bad men have me': She believes scammers cloned her daughter’s voice in a fake kidnapping by Faith Karimi for CNN. Another case of a mother being fooled into thinking that the voice on the phone was her daughter. Quoting:
"Imposter scams have been around for years. Sometimes, the caller reaches out to grandparents and says their grandchild has been in an accident and needs money. Fake kidnappers have used generic recordings of people screaming. But federal officials warn such schemes are getting more sophisticated, and that some recent ones have one thing in common: cloned voices."
- March 5, 2023: They thought loved ones were calling for help. It was an AI scam. by Pranshu Verma for the Washington Post. Faking a person's voice used to require a large voice sample. No more. Bad guys can now replicate a voice with an audio sample of just a few sentences. The audio could come from YouTube, TikTok, Instagram, Facebook videos, or podcasts, making many people vulnerable. Or rather, making their relatives vulnerable to scammers. The technology is now much easier to use and cheaper, making it available to more scammers.
- October 2021: A bank manager in Hong Kong received a call from a man whose voice he thought he recognized, and the bank was scammed out of $35 million. Fraudsters Cloned Company Director's Voice In $35 Million Bank Heist, Police Find by Thomas Brewster in Forbes. Manipulating audio is easier to orchestrate than deepfake videos, so expect more of this in the future.
- July 2021: A documentary about Anthony Bourdain includes three scenes with a fake voice. The director admitted to one scene; no one knows which the other two are. The Ethics of a Deepfake Anthony Bourdain Voice by Helen Rosner for The New Yorker.
- July 2020: Listen to This Deepfake Audio Impersonating a CEO in Brazen Fraud Attempt by Lorenzo Franceschi-Bicchierai of Vice. A security firm analyzed a suspicious voicemail left for a tech company employee, part of an attempt to get the employee to send money to criminals.
- August 2019: Fake voices help cyber-crooks steal cash (BBC, July 2019). Symantec has seen three cases of faked audio of chief executives used to trick financial controllers into transferring cash. Separately, fraudsters deepfaked a CEO's voice to trick a manager into transferring $243,000; that story was originally reported in the Wall Street Journal.
- July 2019: Deepfake Audio Used to Impersonate Senior Executives (CPO Magazine). The attacks seen so far have used background noise to mask imperfections, for example simulating someone calling from a spotty cellular phone connection or being in a busy area with a lot of traffic.
- May 19, 2017: BBC fools HSBC voice recognition security system by Dan Simmons. HSBC introduced voice-based security in 2016. Reporter Dan Simmons has a non-identical twin who was able to access Simmons' account over the telephone by mimicking his brother's voice. The twin made seven failed attempts before he got in. "Separately, a Click researcher found HSBC Voice ID kept letting them try to access their account after they deliberately failed on 20 separate occasions spread over 12 minutes."
VOICE SCAM DEFENSES
The first defense is to be aware that this sort of thing exists. Calls to older adults from someone claiming to be a younger relative who needs money for an emergency are a well-known scam. Likewise, calls from your boss asking you to transfer money to a new bank account are suspicious. Families should have a safe word (or safe phrase) that only they know. This can be used to verify that a voice really belongs to the person it sounds like.
- Be aware that the caller ID on a phone call can be spoofed.
- Be aware that a voice can be cloned using audio clips from social media posts.
- Kidnapping scams need to know where the victim is supposed to be to make the scam believable. To counter this, the FBI says not to post information about upcoming trips on social media. Especially don't mention airplane trips, because someone on a plane cannot be called to verify that they are safe.
- When a loved one calls asking for money, put the call on hold and call them back. Or, have someone else call them back. Or call someone who can verify where the supposed victim really is.
- Verify the identity of the caller. If it is a kidnapping scam, verify the identity of the kidnapping victim.
- Ask them a question that only they could answer. The problem with this is that if you believe the emergency is real, you may not think clearly.
- Set up a security phrase or code word with your loved ones to verify identities on a phone call. For example, emergency requests for money or sensitive information should include the term "hockey pucks".
- October 17, 2023: Should you have a family 'safe word' against AI voice-spoofing scams? by Shira Ovide in the Washington Post. The article offers background on the issue, including the helpful reminder that caller ID can be faked.
- Try to keep your personal information away from the bad guys. To that end, do what you can to improve the privacy of your social media, such that your personal info is only visible to the people you want to share it with.
- Freezing your credit reports can minimize the damage should you get scammed.
- Defensive article: You Need to Create a Secret Password With Your Family by Matt Burgess for Wired. December 25, 2024. The title explains most of the article, but it also goes into the limits of a family password/passphrase.
- Good defensive-focused article: Worried About AI Voice Clone Scams? Create a Family Password by Cooper Quintin of the EFF. January 31, 2024. Quoting: "The family password is a decades-old, low tech solution to this modern high tech problem.
... agree with your family on a password you can all remember and use ... it should be easy to remember in a panic, hard to forget, and not public information. You could use the name of a well known person or object in your family, an inside joke, a family meme ... this does not need to be limited to your family, it can be ... any group of people with which you associate..." Without a pre-chosen family password, the article suggests asking for the name of a pet or about the last time you saw the person who claims to be on the phone.
- "Scammers ask you to pay or send money in ways that make it hard to get your money back. If the caller says to wire money, send cryptocurrency, or buy gift cards and give them the card numbers and PINs, those could be signs of a scam." From this FTC Consumer Alert: Scammers use AI to enhance their family emergency schemes by Alvaro Puig March 20, 2023.
- The US Department of Homeland Security produced a 43-page PDF: Increasing Threat of DeepFake Identities. It is undated, so I have no idea when it was written. There is defensive advice for detecting scam images and videos on page 33. For detecting fake audio, it suggests looking for/at:
• Choppy sentences
• Varying tone inflection in speech
• Phrasing – would the speaker say it that way?
• Context of message – Is it relevant to a recent discussion or can they answer related questions?
• Contextual clues - are background sounds consistent with the presumed location of the speaker?
- May 4, 2023: Voice Cloning Makes Virtual Kidnapping More Convincing by Larry Magid. The article includes defensive advice.
- Defending against audio deepfakes before it's too late by Kaveh Waddell of Axios. April 3, 2019. A review of the state of the art both for creating and detecting fake audio.
- How To Spot Deepfake Audio Fraud by Lauren Sharkey for Bustle. August 21, 2019. The quality of the fake voice can be excellent for non-conversational audio, such as a statement. However, quality suffers when the fake voice has to engage in a conversation.
CONSUMER USES OF FAKE VOICES AND AI
- January 5, 2023. Researcher Deepfakes His Voice, Uses AI to Demand Refund From Wells Fargo by Joseph Cox for Vice. Why stay on the line with the bank for 10, 20, or 30 minutes, when an AI-driven bot could waste that time for you? That's what a new tool from DoNotPay promises. The founder of DoNotPay used an AI-generated version of his own voice to get wire fees reversed. DoNotPay, which has automated a ton of menial tasks, says it plans to make the tool available to customers.
- December 20, 2022. ChatGPT Can Negotiate Comcast Bills Down For You by Edward Ongweso Jr for Vice. DoNotPay styles itself as a consumer advocate, primarily using templates to help users secure refunds from corporations. There are, however, sharp limits to the viability of that model, so the company has recently been experimenting with AI. Quoting: "Now we can really have conversations with companies and that's dramatically increased our success rate and allowed us to pursue much higher levels of disputes. Now we can negotiate hospital bills, lower utility bills, things where the companies respond and we can chat with them in real time ... That's the future of bureaucracy: bots negotiating with each other..." said Joshua Browder, CEO of DoNotPay.