
Scammers are Now Using AI to Try to Steal Your Money

08/23/2023 Written by: Kristine Simmons

If somebody called you from an unknown number, claiming to be from your bank, and asked you to "confirm" your sensitive account information, you would probably recognize right away that this was a scam and hang up. But what if the voice on the other end of the line was a loved one? You recognize them as someone from your family. They sound upset as they describe an emergency and beg for your help. Who could be so cold-hearted as to refuse their request for money?

Unfortunately, this is a new kind of scam that has been using the power of Artificial Intelligence (AI) to fool tens of thousands of people into sending millions of dollars to online thieves. The Washington Post tells the story of a Canadian couple in their 70s who received a call from someone who sounded exactly like their grandson Brandon. He said he was in jail, with no wallet or cellphone, and needed cash for bail. “We were sucked in,” his grandmother said. “We were convinced that we were talking to Brandon.”

The couple dashed down to their bank and withdrew the daily maximum ($2,207 in U.S. currency). Then they hurried to a second branch for more money. But an alert bank manager pulled them aside and told them how another patron had gotten a similar call and learned that the eerily accurate voice had been faked. That’s when they realized they’d been duped.1

The Post reports that technology is making it easier and cheaper for bad actors to mimic voices, convincing people, often the elderly, that their loved ones are in distress. "In 2022, impostor scams were the second most popular racket in America, with over 36,000 reports of people being swindled by those pretending to be friends and family, according to data from the Federal Trade Commission. Over 5,100 of those incidents happened over the phone, accounting for over $11 million in losses, FTC officials said."

Advancements in AI technology now allow bad actors to replicate a voice from an audio sample of just a few sentences. Easily available online tools can then generate a replica of the voice from that audio file, allowing a scammer to make it "speak" whatever they type.

It's difficult for law enforcement to find and prosecute these thieves. And Vice reports that "the courts have not yet decided when or if companies will be held liable for harms caused by deepfake voice technology—or any of the other increasingly popular AI technology, like ChatGPT—where defamation and misinformation risks seem to be rising."2

One way to short-circuit this type of scam is to ask the "loved one" about something only he or she would know. But the surest way to confirm that the need is genuine is to contact family directly to find out whether the loved one is really in the location and situation they claim. With scams like this on the rise, be highly vigilant about any unsolicited requests for money or personal information. We are happy to share resources to help you protect yourself from online thieves.
