Rohingya Refugees Sue Meta for $150 Billion, Citing Facebook’s “Negligence” in Myanmar Crisis

The Rohingya Facebook lawsuit seeks $150 billion from the social media giant, alleging that its negligence fueled a genocide in Myanmar.

By Dan Clarendon

Dec. 7, 2021, Published 9:29 p.m. ET


A combined lawsuit from Rohingya refugees in the U.S. and around the world could make Facebook pay billions after hate speech proliferated on the platform and incited violence in Myanmar. According to the Associated Press, Rohingya refugees are suing Meta, Facebook’s parent company, for more than $150 billion, with lawyers filing a class-action suit in California on Dec. 6. Lawyers in the U.K., meanwhile, say they intend to file similar legal action.


The Rohingya, a Muslim ethnic group, have been fleeing violence in Myanmar for years, with an estimated 1 million refugees living in camps in Bangladesh and around 10,000 others seeking refuge in the U.S., the Associated Press reports.

“Facebook’s negligence encouraged and facilitated the genocide carried out by the Myanmar regime”

The California lawsuit, published online, details the claimants’ grievances against Facebook. “Despite having been repeatedly alerted between 2013 and 2017 to the vast quantities of anti-Rohingya hate speech and misinformation on its system, and the violent manifestation of that content against the Rohingya people, Facebook barely reacted and devoted scant resources to addressing the issue,” the court filing reads.


“The resulting Facebook-fueled anti-Rohingya sentiment motivated and enabled the military government of Myanmar to engage in a campaign of ethnic cleansing against the Rohingya. To justify and strengthen its hold on power, the government cast, by and through Facebook, the Rohingya as foreign invaders from which the military was protecting the Burmese people.”


Mishcon de Reya LLP, one of the law firms representing the claimants outside the U.S., alleged in a statement that “Facebook’s negligence encouraged and facilitated the genocide carried out by the Myanmar regime and its extremist supporters against the Rohingya people.”

The law firm went on to say that “Facebook used algorithms that amplified hate speech against the Rohingya people on its platform. Facebook failed in its policy and in practice to invest sufficiently in content moderators who spoke Burmese or Rohingya or local fact-checkers with an understanding of the political situation in Myanmar.”


U.N. investigators also put blame on Facebook, which admitted it “can and should do more” to address the situation

In March 2018, seven months after a security crackdown in Myanmar’s Rakhine state intensified the Rohingya crisis, U.N. investigator Marzuki Darusman told Reuters and other reporters that social media had played a “determining role” in Myanmar and “substantively contributed to the level of acrimony and dissension and conflict, if you will, within the public.” Darusman added, “Hate speech is certainly, of course, a part of that. As far as the Myanmar situation is concerned, social media is Facebook, and Facebook is social media.”


U.N. Myanmar investigator Yanghee Lee said, “Everything is done through Facebook in Myanmar. … It was used to convey public messages, but we know that the ultra-nationalist Buddhists have their own Facebooks and are really inciting a lot of violence and a lot of hatred against the Rohingya or other ethnic minorities … I’m afraid that Facebook has now turned into a beast, and not what it originally intended.”

Facebook eventually commissioned Business for Social Responsibility to conduct an independent assessment of the platform’s impact in Myanmar, the results of which Facebook published online in November 2018. In a blog post accompanying the report, Facebook product policy manager Alex Warofka said the company agreed that it “can and should do more.”
