© Copyright 2023 Market Realist. Market Realist is a registered trademark. All Rights Reserved. People may receive compensation for some links to products and services on this website. Offers may be subject to change without notice.

Here's How And Where Victims Of AI Deepfake Scams Can Seek Remedy

Even with the increasing number of scams, jurisdictions are falling behind in regulating AI
PUBLISHED FEB 29, 2024
Facial-recognition technology is operated at Argus Solutions | Getty Images | Photo by Ian Waldie

Generative artificial intelligence tools have proven to be both useful and harmful in recent times. These powerful programs can create image, video, and voice replicas of real people, which scammers across the globe are increasingly exploiting. Recently, a Hong Kong-based company lost $25 million in an AI deepfake scam. Yet even as such scams multiply, jurisdictions across the world are falling behind in regulating AI and offering remedies to the victims. There are, however, a few things that victims can do, depending on the nature of the scam and their country of residence.

Victims of scams promoted on social media can technically seek damages from the platform that hosted the scam. This, however, may not work in the US, where platforms are explicitly shielded from liability under Section 230 of the Communications Decency Act because they are treated as mere conduits of content.

In other countries, such as Australia, this protection does not exist. According to a Tech Xplore report, the Australian Competition and Consumer Commission (ACCC) has taken Meta to court, seeking to hold its social media platforms (Facebook, Instagram) liable for deepfake crypto scams if they actively target the ads at possible victims.

Thus, it might soon become a legal obligation for the platform to be accountable for scams promoted through them.

Victims can also seek damages from banks and ask for a payment reversal or refund of the lost amount. For instance, in the UK, a pensioner who lost his entire savings to a crypto scam was refunded by his bank. In this case, it was argued that banks have some inherent responsibility to protect their customers from scams.


Further, in Australia, the ACCC and others have proposed a scheme that would mandate banks to reimburse victims of payment scams, at least in some circumstances.

Currently, the makers of generative AI tools are under no legal obligation to take responsibility if their tools are used for fraud or deception; they have no duty to protect people from the dangerous or unlawful use of their tools.

However, this may change soon: the recently proposed EU AI Act would obligate providers of generative AI tools to design their tools so that the content they generate can be easily detected.

The proposed legislation could force companies to adopt digital watermarking and other safeguards to protect people. However, the effectiveness of the proposed act is still being debated.


Prevention is better than cure; thus, people should take all necessary steps to safeguard themselves from deepfake scams. Here are a few things people can do to stay safe.

1. Business owners should ensure that all of their employees are well versed in deepfake content and the latest scams.

2. People should make sure that their family members, especially those who are less tech-savvy, know about deepfake scams and how they work.

3. It is advisable to keep tabs on the latest deepfake scams and other such schemes circulating on social media and other platforms.

4. Always verify, through a separate channel, any call or message that asks for a monetary transaction.
