Now, AI-Backed Scams Are Targeting News Editors with Elaborate Copyright Threats
The advent of generative artificial intelligence has opened multiple avenues of innovation: AI now creates content and serves as a powerful tool for data analysis. Unsurprisingly, AI-backed scams have emerged alongside it to prey on unsuspecting victims online. One recent scheme aims to dupe news editors into embedding links to bogus websites within their articles.
Scammers play a long game using bogus, AI-backed 'law firm'
Is artificial intelligence the best invention for scammers since the internet began? Here's one story of an AI-generated law firm sending threats to win backlinks.
A new artificial intelligence-backed scam is as… pic.twitter.com/S0Pyex3IwT
— Eric Casey (@casey_eric53053) November 7, 2023
What is the AI-backed DMCA Scam?
The Sydney-based deputy editor of Cointelegraph's Asia-Pacific news desk recently reported that a purported law firm emailed a “DMCA Copyright Infringement Notice,” claiming that one of the publication's articles had used a copyrighted stock image owned by an obscure cryptocurrency firm.
The image, however, did not even appear in the article in question. The purported law firm nevertheless hit the editor with another email, this time claiming to work on behalf of a different AI-backed crypto platform.
What were the demands of the scammers?
The threat demanded that Cointelegraph link the content to the alleged owner’s website. A person named “Alicia Weber,” a purported employee of “Nationwide Legal Services,” issued the threat and gave the editor five days to provide a link, or else she would file a copyright lawsuit.
The scammer further claimed that removing the image would not rectify the issue and demanded that the editor include a link to the “notable entity” and “prominent organization” she mentioned.
How can you spot such a scam?
The editor noted that the law firm’s website used a .site extension instead of .com, a clear red flag. The headshots of the so-called lawyers on the firm’s website also looked AI-generated or deepfaked, and the corporate headshots of the obscure crypto firm's “dream team” bore AI-generated hallmarks and glossed-over eyes.
Now we are in the territory where its hard to tell any more if this is a photo or AI generated. Critique this. pic.twitter.com/aLqvaITKDH
— Linus (●ᴗ●) (@LinusEkenstam) December 12, 2023
Meanwhile, the second fake firm named in the correspondence had even labeled its website’s team page “Our AI Generated Cyborg Team.” Both websites were padded with large amounts of AI-generated content to make them look authentic.
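Red flags like the unusual domain extension lend themselves to simple triage. The Python snippet below is a minimal, hypothetical sketch of scanning an incoming “legal notice” for domains with top-level domains rarely used by legitimate law firms; the TLD watch list and the example domain are illustrative assumptions, not details taken from the report.

```python
import re

# Hypothetical watch list of TLDs that rarely appear on legitimate law-firm
# sites; the entries are illustrative assumptions, not drawn from the article.
SUSPICIOUS_TLDS = {".site", ".online", ".xyz", ".top"}

def extract_domains(text: str) -> set:
    """Pull bare domain names (e.g. example.site) out of an email body."""
    return set(re.findall(r"[\w.-]+\.[a-z]{2,}", text.lower()))

def flag_suspicious_domains(text: str) -> list:
    """Return any domains in the text whose TLD is on the watch list."""
    flagged = []
    for domain in extract_domains(text):
        tld = "." + domain.rsplit(".", 1)[-1]
        if tld in SUSPICIOUS_TLDS:
            flagged.append(domain)
    return sorted(flagged)

if __name__ == "__main__":
    # Made-up example email body; the domain is a placeholder, not the
    # scammers' real address.
    email_body = (
        "DMCA Copyright Infringement Notice from nationwide-legal-services.site. "
        "Add a link to our client within five days or face a lawsuit."
    )
    print(flag_suspicious_domains(email_body))
    # -> ['nationwide-legal-services.site']
```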
Unsuspecting editors who are too busy to cross-check may easily hand backlinks to these websites, believing them to be genuine publishers or simply hoping to stave off a potential lawsuit. It is therefore important to research any demand arriving in a dubious email thoroughly before complying.
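Given that the disputed image in this case did not even appear in the cited article, one simple, mechanical first step is to verify the claim itself. Below is a minimal, hypothetical sketch of fetching an article and searching its HTML for the claimed image filename; the URL and filename are placeholders, not details from the Cointelegraph report.

```python
import urllib.request

def article_contains_image(article_url: str, image_name: str) -> bool:
    """Download the article's HTML and look for the claimed image filename."""
    with urllib.request.urlopen(article_url, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    return image_name.lower() in html.lower()

if __name__ == "__main__":
    present = article_contains_image(
        "https://example.com/",          # placeholder article URL
        "disputed-stock-image.jpg",      # placeholder image filename
    )
    print("Claimed image found in article:", present)
```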
The editor noted, however, that the two websites did not appear to be running any other scams; even signing up with an email address on one of the sites led nowhere.
Other AI-backed scams
The scam described in the Cointelegraph report resembles the lazier phishing scams that operated on X (formerly Twitter), in which automated AI bots posted links to Google Forms designed to collect seed phrases.
Another example is the AI-backed investment scam, in which fraudsters present fake investment opportunities supported by AI-generated data to lure unsuspecting investors into fraudulent schemes.
Malicious Abrax666 AI Chatbot Exposed as Potential Scam https://t.co/9Rh5buiJLF
— Nicolas Krassas (@Dinosn) November 14, 2023
While there are useful chatbots such as ChatGPT and Bard, scammers also deploy dubious AI-powered chatbots that pose as genuine service providers to gather personal and sensitive financial information, or even to convince unsuspecting victims to transfer money into their accounts.