© Copyright 2023 Market Realist.

Now, AI-Backed Scams Are Targeting News Editors with Elaborate Copyright Threats

Scammers are looking to dupe news editors into providing links for their bogus websites.
Cover Image Source: Pexels | Photo by Rodrigo Santos

The advent of generative artificial intelligence has opened multiple avenues of innovation: AI now creates content and also serves as a powerful tool for data analysis. Unsurprisingly, AI-backed scams have emerged alongside it to rob unsuspecting victims on the internet. One recent scam aims to dupe news editors into placing links to bogus websites within their content.


The deputy editor of Cointelegraph's Asia-Pacific news desk, based in Sydney, recently reported that a purported law firm emailed a “DMCA Copyright Infringement Notice,” claiming that one of the publication's articles had used a copyrighted stock image owned by a vaguely described cryptocurrency firm.

However, the image did not even appear in the article in question. The purported law firm nonetheless followed up with another email, this time claiming to work on behalf of a different AI-backed crypto platform.

The notice demanded that Cointelegraph link the content to the purported owner’s website. A person named “Alicia Weber,” supposedly an employee of “Nationwide Legal Services,” issued the threat and gave the editor five days to provide the link or face a copyright lawsuit.

Image source: Pexels

The scammer further claimed that removing the image would not rectify the issue and demanded that the editor include a link to the “notable entity” and “prominent organization” she mentioned.

The editor noted that the law firm’s website uses a .site extension rather than .com, a clear red flag. The editor also noticed that the headshots of the so-called lawyers on the firm’s website appeared to be AI-generated or deepfaked. Likewise, the corporate headshots of the vague crypto firm's “dream team” bore AI-generated hallmarks, such as glossed-over eyes.
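Red-flag checks like the one above can even be automated. The following is a minimal illustrative sketch, not part of the Cointelegraph report: the list of suspicious top-level domains is a hypothetical example, and a real vetting workflow would check far more than the TLD.

```python
# Illustrative sketch: flag a suspicious sender domain before acting
# on a legal demand. The TLD list below is hypothetical, not exhaustive.

SUSPICIOUS_TLDS = {"site", "online", "top", "xyz", "club"}  # hypothetical examples

def domain_red_flags(sender: str) -> list[str]:
    """Return a list of red flags found in an email sender address."""
    flags = []
    try:
        domain = sender.rsplit("@", 1)[1].lower()
    except IndexError:
        return ["not a valid email address"]
    tld = domain.rsplit(".", 1)[-1]
    if tld in SUSPICIOUS_TLDS:
        flags.append(f"unusual top-level domain: .{tld}")
    return flags
```

For instance, a sender address ending in a .site domain would be flagged, while a plain .com address would pass this (deliberately narrow) check.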


Meanwhile, the second fake firm named in the emails labeled its website’s team page “Our AI Generated Cyborg Team.” Both websites were padded with large amounts of AI-generated content to appear authentic.

Unsuspecting editors who are too busy to cross-check may easily provide backlinks to these websites, believing the senders are genuine publishers and hoping to stave off a potential lawsuit. It is therefore important to research thoroughly before complying with any demand from a dubious email.

However, the editor noted that the two websites did not appear to be running any other scams; even signing up with an email address on one of the sites produced nothing.

The scam described in the Cointelegraph report resembles the lazier phishing scams that operated on X (formerly Twitter), in which automated AI bots posted links to Google Forms designed to collect seed phrases.

Another example of an AI-backed scam is the investment scam, in which fraudsters present fake investment opportunities backed by AI-generated data to lure unsuspecting investors into fraudulent schemes.


While there are useful chatbots such as ChatGPT and Bard, scammers also deploy dubious AI-powered chatbots that pose as genuine service providers to gather personal and sensitive financial information, or even to convince unsuspecting victims to transfer money into the scammers' accounts.