
Scam Calls Using AI Voice Cloning Are On The Rise; Here's How To Stay Protected

The impostors have become so convincing with fake voices that even experts find it hard to detect the fakes.
PUBLISHED FEB 3, 2024
Scam calls use AI voice cloning | Pexels | Photo by Hassan OUAJBIR

While Artificial Intelligence is being used to create wonders like chatbots, it is also being used to scam people across the globe. Deepfakes were prevalent even before the advent of generative AI, but advances in generative AI tools have given scammers new ways to fool people. The impostors have become so convincing with fake voices that even experts find it hard to tell real from fake, and AI voice scam calls are racking up more and more victims. However, there are ways the public can protect themselves and possibly even prevent such scams.

Representative Image | Getty Images | Photo by Andrea Verdelli

The advanced generative AI tools of today can imitate anyone's voice or appearance to near perfection. In some cases, impostors pretend to be a distressed family member of the victim who needs money. In other cases, scammers impersonate celebrities or politicians to circulate fake information, as in a recent scam involving US President Joe Biden.

As per an NBC report, a robocall impersonating President Biden recently urged voters in New Hampshire not to participate in the presidential primary. In a voice that sounds like President Biden's, the fraudulent robocall instructed recipients to save their vote for the November 2024 election.

The concept behind AI voice cloning tools is simple: the tool is trained on samples of someone's voice and then generates audio that sounds like that person. Scammers get hold of this training data from the videos and audio that people share on social media, which act as treasure troves of information for them.
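
To illustrate how low the barrier has become, below is a minimal sketch of zero-shot voice cloning. It assumes the open-source Coqui TTS package and its publicly available XTTS v2 model; the file names are hypothetical placeholders, and a few seconds of clear speech from a public clip is typically all the model needs as a reference.

# Minimal sketch, assuming the open-source Coqui TTS package (pip install TTS)
# and its XTTS v2 voice cloning model. File names are hypothetical placeholders.
from TTS.api import TTS

# Load a pretrained multilingual voice cloning model (downloaded on first use).
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# A short, clear speech sample (e.g., pulled from a public social media video)
# serves as the reference voice; the spoken text can be anything.
tts.tts_to_file(
    text="This is a demonstration of how little audio a voice clone needs.",
    speaker_wav="reference_clip.wav",  # a few seconds of the target's voice
    language="en",
    file_path="cloned_output.wav",
)

With samples that easy to obtain and use, a little caution goes a long way. Here are a few ways to stay protected from such scams.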

1. Verify authenticity

For AI voice scams involving celebrities or politicians, it is always best to fact-check whether the person really said it. For instance, in the case of the recent robocall impersonating President Biden, a simple Google search yields results from multiple publications debunking it.

2. Scrutinize calls from unknown numbers claiming to be family or friends

For AI voice scams imitating distressed family members or friends, the first thing to check is whether the caller really is that person. When scammers use sophisticated technology, it can be hard to tell, but clues such as the way of talking, mannerisms, particular words, pitch, and accent may give the fake away. Especially when the caller urgently asks for money, it is best to take a moment and verify the claim thoroughly, for example by hanging up and calling the person back on a number you already have.

3. Set up a family password

Modern problems require modern solutions. While it may sound nerdy, it is indeed wise to have a family password that can be used to quickly verify a caller's identity. A simple word or phrase will do, and it can come in handy in situations of genuine distress.

4. Seek the help of authorities

If you come across a suspicious call from an unknown number, it is best to report it to the authorities as soon as possible. Even if a family member or friend really is in trouble, the authorities can still help.

5. Be mindful of what you share 

For anyone who isn't a celebrity, their social media feed largely determines whether they can become the subject of an AI voice cloning scam. Social media accounts carry information about friends and family as well as media such as videos, photos, and audio, all of which scammers can use. It is therefore best to avoid sharing lengthy videos containing speech or conversations and to lock down the privacy settings on social media accounts so that strangers cannot access that media.
