Alert Grandma Avoids Losing Thousands in AI Voice-Cloning Scam

A San Diego grandmother almost lost thousands of dollars to an AI-generated voice that sounded like her beloved grandson, according to ABC affiliate 10News.

Maureen recently received a phone call from a withheld number. Thinking it was her sister, who often calls with a hidden number, the North County grandma picked up the phone, only to hear a voice eerily similar to her grandson’s, pleading in distress.

“The voice on the other end said, ‘Grandma, grandma. I’m in trouble. I need your help,’” recalled Maureen, who asked that her surname not be published for privacy reasons.

Also read: How to Spot AI Deepfakes on Social Media

Falling for the ‘scary’ AI con

The supposed grandson told Maureen that he had been injured in a car accident, was wearing a neck brace, and was headed to the police station. The AI voice-cloned ‘grandson’, whose name was not revealed, said he needed money for bail, $8,200 in total.

“It sounded exactly like him [grandson] or else I would have never believed it. I have no doubt in my mind that it was him. That’s the scary part,” Maureen said.

To make the story stronger, more believable, and scarier, an alleged lawyer jumped on the call and told Maureen her grandson had hit a diplomat in the accident and was in hot water. The ‘lawyer’ warned her not to tell anyone what had happened for 72 hours.

Grandma Maureen fell for the con, one made even more convincing by artificial intelligence technology. “Scared to death” for her grandson, she scrambled to put together the supposed ‘bail’ money and dashed to the bank to get some more.

But Maureen stayed alert. Wisely, before handing over her hard-earned cash, she first called her daughter to confirm her grandson’s well-being. She learned that her real grandson was, in fact, perfectly fine and attending a golf tournament. The scammer, meanwhile, flew into a rage.

On a second call, answered by Maureen’s daughter, the cybercriminal “proceeded to call her all kinds of horrible, ugly names,” said the North County grandmother.

It is unclear where the scammers got Maureen’s grandson’s voice. But AI scams of this nature require only a short audio sample, often scraped from social media platforms such as YouTube, TikTok, Instagram, or Facebook, or from podcasts and commercials, experts say.

Image: KGTV (ABC 10News)

Growing and troubling trend

The impostor scam typically involves a scammer who impersonates a trusted person, such as a friend, a lover, or, as in Maureen’s case, a grandchild, and convinces the victim to send money because the impostor is in some form of urgent trouble.

Colloquially known as the ‘grandma scam’, the trick has fooled many people, often the elderly, across the United States and around the world. It is a growing trend that has become a major worry for law enforcement, and artificial intelligence is making it worse.

AI is helping bad actors imitate voices more easily and cheaply. Voice-cloning tools such as ElevenLabs can replicate a person’s speech from a short sample, while deepfake tools can manipulate faces and mouth movements, making it easy for people to believe that a fabricated video or audio recording is authentic.

In 2022, impostor scams were the second most commonly reported fraud in the U.S., with over 36,000 reports of people being swindled by thieves pretending to be friends or family, according to data from the Federal Trade Commission.

More than 5,100 of those incidents happened over the phone, accounting for over $11 million in losses, according to FTC figures cited by The Washington Post.

Maureen, the grandmother from San Diego, said that her family has come up with ways to keep themselves safe from AI scams.

“Our family has what’s called a safe word, and it should be a word that nobody else would know. Don’t send it via text or email. Call them directly on the phone,” Maureen said.

“It was so terrifying emotionally. I don’t want anybody to have to go through this,” she added, per the 10News report.

Image credits: Shutterstock, CC images, Midjourney, Unsplash.
