
Texas Man Shares How His Father Lost $1,000 in AI Voice Scam

The scammer allegedly used artificial intelligence to replicate his grandson's voice and pleaded for money and help. 
UPDATED JAN 22, 2024
Cover Image Source: Pexels | Mati Mango

A man from Texas came forward to warn about the dark side of artificial intelligence (AI) after his father lost $1,000 in a scam. 

"When you're in anxiety and panic, when you're there to help your family, I think logic gets thrown out the window,” Lee Hall told Good Morning America. "That's what the scammers are betting on, so now we have to raise our level of sophistication so these things won't happen to our loved ones."

Hall said his father received a distress call that he thought was from his grandson and sent $1,000 to the alleged scammer.

"He told his grandfather he was vacationing in Mexico with his friends. Got in trouble. Scared to death. He got into a little bit of drinking and got into an accident and now that's the reason why he's in a little bit of trouble and he needs some money to get him out of that situation," Hall said of the call his father received, believing it was from his grandson.


"That story is very believable," Hall said. "What's scary is my father is totally off the grid. He doesn't have any social media accounts. He doesn't even have an email. But the fact that they still connected to my son, that makes it even scarier."

He said the only way the family could tell it was a scam was that they knew his son, Christian, was okay. "We knew he was in Dallas, so we called him right away and he said, 'Yes, Mom, I'm in college in Dallas,' so we knew it was a scam."

In a similar incident, Jennifer DeStefano, a mother in Arizona, told GMA that she received an alarming call which turned out to be an AI voice cloning scam. 

AI's Notoriety in Cybersecurity

Pexels | ThisIsEngineering

While AI offers several advantages to its users, the cybersecurity threats it poses are a growing concern. Annual cyber losses are difficult to estimate; however, according to the Centre for International Governance Innovation (CIGI), the International Monetary Fund places them in the range of US$100 billion to $250 billion a year for the global financial sector.

AI can be used to launch more effective and sophisticated attacks that are very difficult to recognize. At the same time, experts are trying to build stronger cybersecurity models using AI, which can be used to automatically generate new security controls or even identify new vulnerabilities and work to counter them.

AI Can Easily Replicate Your Voice

Pexels | cottonbro studio

According to The Guardian's investigation, the "voiceprint" voice-recognition security systems used by Centrelink and the Australian Tax Office can be easily fooled by AI. Dr. Lisa Given, a professor at RMIT University, believes that AI can easily fool a person into thinking they are talking to someone they know.

"When a system can reasonably copy my voice and also add in empathy, you could imagine that a scammer could move from sending a text that says, 'Hey mum, I've lost my phone,' to making a phone call or sending a voicemail that was actually attempting to create that person's voice," she said, as per ABC.

Measures You Can Take To Protect Yourself

Pexels | Tima Miroshnichenko

The US Federal Trade Commission recently warned consumers about fake family-emergency calls and highlighted a few steps people can take to protect themselves from voice cloning. It urged everyone to verify such calls by contacting the friend or family member directly, or to agree on a safe word to use over the phone in a genuine emergency. The commission also advised being very wary of unexpected calls, noting that even caller ID can be faked.

The last tip is pretty much a no-brainer: the commission urged everyone to never share personally identifying information such as their birth date, address, or middle name.