FTC Warns Consumers About Voice Cloning AI Scams — How to Stay Safe
AI scams are continually evolving, and the FTC is warning consumers not to trust calls that could use voice cloning. Here's how to stay safe.
The number of ways consumers can be duped out of their money continues to grow, so it's essential to stay updated on the most common types of scams, including AI scams. A new version of the old "family member emergency" scam is on the rise, but you can be prepared and know how to spot the warning signs.
Although technologies change, a scammer's primary goal remains the same: getting money from someone through dishonest means. Whether it's through Medicare robocalls, romance scams, or fake text messages, scammers know how to make consumers act rashly and give up their money or personal information.
How do AI scams work?
The rise of artificial intelligence (AI) has been a source of concern for plenty of people, and one way AI can be used for illegal activity is through scams. The Federal Trade Commission, or FTC, has released a new warning to the public about AI scams using voice cloning to convince you it's a loved one calling you for help.
The specific type of AI scam the FTC described is a variation on the "grandparent scam," in which an older person receives a call from someone claiming to be a police officer or other authority figure acting on behalf of a relative, saying the relative needs money for bail or some other emergency. While this is bad enough, AI is making these scams more convincing.
AI scammers can now use voice cloning technology to create a copy of your loved one's voice. The scammer only needs a brief audio clip of the person's voice, which is often readily available online, and a voice cloning program. Anyone could call you, sound exactly like your grandchild or other relative, and convince you they need help.
Here are some recent real-life examples of AI voice-cloning scams.
As NPR reported, the FTC couldn't give an accurate estimate of how many people have been victimized by voice-cloning technology. However, The Washington Post reported this spring that "impostor scams" were the second-most common scam in the U.S. in 2022. Of those, 5,100 incidents happened over the phone, many of which may have been enhanced by voice cloning.
NPR noted that at least eight Canadian senior citizens lost a total of $200,000 in a voice-cloning scam in 2023. Early in 2020, a bank manager in Hong Kong transferred a large sum of money at the direction of a scammer using voice-cloning technology.
How can you avoid falling for AI voice-cloning scams?
If you receive a panicked call from someone who claims to be — and sounds just like — a loved one requesting help, hang up. Then contact that person using a phone number you trust so you'll know it's really them. If you can't reach them directly, try another relative or friend to verify the story.
The FTC warns that certain payment methods a caller requests are red flags in themselves:
- wire transfer
- buying gift cards and sending them the numbers and PINs
- any other way that would make it hard to get the money back
The Post noted that forensics professor Hany Farid said voice-cloning technology is getting more accurate, able to analyze the unique characteristics of a person's voice. Since many people post voice content to public social media platforms, obtaining a sample isn't hard: "Now … if you have a Facebook page … or if you've recorded a TikTok and your voice is in there for 30 seconds, people can clone your voice."
It's very hard for authorities to track down these scammers, especially if victims send money through hard-to-trace means like cryptocurrency or gift cards. Therefore, the best way to combat AI scams is to be wary of anyone asking for money, even if it sounds like someone you know. It only takes a few minutes to hang up and call the person to find out the truth.