AI voice cloning scams on the rise, expert warns

Scammers are increasingly using artificial intelligence (AI) tools to clone the voices of people they target on social media. They then place panicked calls to the target's family or friends, hoping to convince the unwitting recipient to hand over money or access to sensitive information.

Mike Scheumack, the chief innovation officer at identity theft protection and credit score monitoring firm IdentityIQ, told FOX Business, "AI has been around for a long time and software companies have been using it to advance technology for a while. We've seen it start entering into this kind of cybercriminal space slowly, then all of a sudden just ramp up very quickly over the past year or so."

“We’ve seen a lot in terms of advanced phishing scams, targeted phishing scams, we’ve seen where AI is being used to generate very specific emails and the language is very specific as to who the target is,” he added. “We’ve seen AI voice cloning scams increase over the past year as well, which is a very scary topic.”

Fraudsters carrying out voice cloning scams will record a person's voice or find an audio clip on social media or elsewhere on the internet. "All they need is as little as 3 seconds; 10 seconds is even better to get a very realistic clone of your voice," Scheumack explained. The audio sample is then run through an AI program that replicates the voice, allowing the scammer to make it say whatever they type and to add laughter, fear, and other emotions to the cloned voice depending on how the scam is scripted.

To demonstrate how sophisticated AI voice cloning programs are, IdentityIQ took an audio sample from an interview the author of this article did on the "Fox News Rundown" podcast this spring. The firm used that sample to create an AI voice clone simulating a panicked phone call to a family member, requesting a cash app transfer following a fictitious car accident:

“Mom, I need to talk to you. I, I was going to interview someone today for a story I’m working on and I was involved in a car accident. I’m okay, but I need your help right now. I, I hit the other car’s bumper. They want $1,000 to cover the cost to repair the damage or they’ll call the police and report it to insurance. They need the money now, and I need your help. Can you please send me $1,000 over Zelle? I can tell you how to do it,” the voice clone said.

Scheumack noted that voice clone calls from scammers are typically shorter than this example, and the caller may try to cut off a potential conversation by saying something like, "I can't talk right now," as they relay the request for money, account access, or other sensitive information.

“The goal of the scammer is to get you into fight or flight and to create urgency in your mind that your loved one is in some sort of trouble. So the best way to deal with those situations is to hang up and immediately call your loved one to verify if it’s them or not,” he explained.

Scheumack cited a recent IdentityIQ interview with a woman who received what she believed was a panicked call from her daughter at a camp; it was actually an AI-generated clone of her daughter's voice. The scammers had found a social media post the daughter made about going to camp and used it to make the call more realistic.

Fraudsters carrying out AI voice scams are also using AI programs to search the internet for information about individuals and businesses, including audio or video posts on social media or elsewhere, for details that can be used to make more compelling calls to unwitting victims, Scheumack noted.

"The scary thing is that this is not your next-door neighbor doing this… This is a sophisticated organization, it's not one person doing it. You have people that are researching on social media and gathering data about people. Those are not the same people that are going to plug in your voice. You have somebody else that's going to clone the voice. You have somebody else that's going to actually commit the act of calling. And you have somebody come to the victim's house and pick up money if the scam is working."

As for steps individuals can take to avoid falling victim to an AI voice cloning scam, Scheumack said they should be aware of what they post publicly online and think twice before responding to an urgent call from an unknown number that is ostensibly from someone they know.

"Be cautious about what you post online, that's a first step," Scheumack said. "The second step is if you do receive a phone call from an unknown number and it's somebody that you love, generally take caution with that. That should be a red flag to you: if you're receiving a call from an unknown number and it's a relative or a loved one and there's an urgent situation, you should definitely take a second to think about that."

Scheumack recommended that families consider setting up a password, prompted by an agreed-upon phrase, that can be used to verify that a caller citing some sort of emergency is indeed the family member they say they are.
