It was an average afternoon when an unknown number popped up on Frank’s phone.

“I typically don’t pick up an unknown number but for some reason I did and it was the voice of my daughter,” he recalled. “And she said, ‘Dad, I’m in jail.’”

Frank, who chose not to share his last name, said he instantly recognized the voice as his daughter’s.

“It was so real. I mean it was unbelievably real. It was my daughter’s voice,” he said.

The caller told him she had gotten into a car accident involving a pregnant woman and needed bail money.

“We’re on a retired income and I’m like, ‘Where am I going to come up with $12,500?’ So, I’m in a panic,” Frank said.

He was so panicked that he drove to the bank to pull out the cash. He said he was just about to withdraw the money when he heard from his actual daughter.

“Then it hit me. It was a scam,” Frank said.

While it was his daughter’s voice he heard, it wasn’t actually her; it was part of a scheme by someone using artificial intelligence voice cloning.

“It was very traumatizing because you never expect to hear something like that in a phone call,” Frank said.

Officials have started warning consumers that calls like this are increasing as the technology becomes easier to use. The Federal Trade Commission (FTC) issued an alert earlier this year warning that scammers can clone people’s voices using short audio clips taken from videos posted online. The Better Business Bureau’s (BBB) Scam Tracker has logged more than 200 complaints related to AI this year, and the BBB has published its own advice to help consumers distinguish what is real from what is fake.

McAfee, a global computer security company, conducted a survey this year and found that victims have lost between $500 and $15,000 to AI voice cloning attacks.

“These messages are the latest examples of targeted ‘spear phishing’ attacks, which target specific people with specific information that seems just credible enough to act on it. Cybercriminals will often source this information from public social media profiles and other places online where people post about themselves, their families, their travels, and so on—and then attempt to cash in,” McAfee reported.

Margrit Betke is one of the leaders of Boston University’s Artificial Intelligence Research initiative.

Her research has involved using AI to help people with motion disabilities, but she is concerned that as the technology becomes more readily available, its misuse will increase.

“It can be very dangerous because you could make people say outrageous things and people might believe it,” she said.

Betke said that in addition to scammers using the technology to steal money, she is worried about its use on social media and during political campaigns.

“It’s so easy now to switch around from legitimate uses of AI to something you shouldn’t be doing just because you can create these embeddings so easily,” Betke said.

She said that in the past, people needed hours of someone’s voice before they were able to clone it; now only seconds are required. Less work also means the technology is easier for anyone to use.

“I think it’s really worrisome that it’s so easy now for anybody to just generate videos and voices of existing people,” Betke said.

She believes there need to be regulations to better monitor the use of voice cloning and other AI technology. President Biden recently issued an executive order aimed at increasing oversight and regulation of AI. Part of the order includes establishing guidance for adding watermarks to AI-generated content.

Frank said he reported his case to the local police, but since he fortunately didn’t lose any money, there wasn’t anything they could do.

He said his family is still shaken up from the incident and has created a protocol going forward so they can quickly verify whether the person on the other end is actually a family member.

The FTC also recommends that people immediately call the person who claims to be in danger to verify the situation. Another piece of advice is for families to create a code word to use if there really is an emergency, so they can confirm the caller is actually a family member. Experts also recommend making your social media accounts private so scammers can’t pull audio of you from your online profiles and use it against you.

(Copyright (c) 2023 Sunbeam Television. All Rights Reserved. This material may not be published, broadcast, rewritten, or redistributed.)