How Facebook threatens vulnerable Muslim communities

The social media platform has been used to incite and condone violence against adherents of the Islamic faith, from Myanmar to Kashmir to Palestine
The social media giant Facebook "has now turned into a beast" (AFP)

The social media giant Facebook poses an existential threat to vulnerable Muslim communities.

This assessment is based on how Facebook has failed to prevent its platform from being used to incite mob violence against adherents of the Islamic faith. Palestinian and Kashmiri human rights activists have long complained of having their accounts suspended or permanently deleted after posting videos of Indian and Israeli soldiers carrying out human rights violations.

"Facebook has now turned into a beast, and not what it originally intended," said Yanghee Lee, a UN investigator who in 2018 described the social media platform as a vehicle for inciting "acrimony, dissension and conflict" and driving the Rohingya Muslim genocide in Myanmar.

Defying 'community standards'

A recent investigation by the Wall Street Journal has revealed that when it comes to the safety and wellbeing of vulnerable Muslim minorities, Facebook puts profits and politics not only before social and moral responsibility, but also before its own stated user policies, which it calls "community standards". This was evidenced by its refusal to punish a right-wing Indian politician for advocating violence against Muslims, on the grounds that doing so would be bad for the company's business.


"We do not allow hate speech on Facebook because it creates an environment of intimidation and exclusion and in some cases may promote real-world violence," reads the site's hate-speech policy. "We define hate speech as a direct attack on people based on what we call protected characteristics - race, ethnicity, national origin, religious affiliation, sexual orientation, caste, sex, gender, gender identity, and serious disease or disability."

When T Raja Singh, a member of India's ruling BJP, called for the slaughter of Rohingya Muslim refugees, threatened to demolish mosques, and labelled Indian Muslim citizens as traitors, Facebook's online security staff determined his account should be banned for not only violating its community standards, but also for falling under the category of "Dangerous Individuals and Organizations".

That's when Ankhi Das, Facebook's public policy director for India, stepped in to protect Singh from punitive action, because "punishing violations by politicians from Mr Modi's party would damage the company's business prospects in the country," according to Facebook employees cited by the Wall Street Journal.

Turning a blind eye

These revelations should be seen not as an isolated incident, but rather in the broader context of Facebook managing its business in a way that puts it in lockstep with the Hindu nationalist agenda - because India, with its more than 290 million Facebook users, represents a key market. 

Facebook did not respond to Middle East Eye's request for comment.


"For years now, verified Facebook pages of BJP leaders such as Kapil Mishra have routinely published hate speeches against Muslims and dissenting voices. The hate then translates into deadly violence, such as the anti-Muslim attacks in Delhi that left many people dead in February in some of the worst communal violence India's capital had seen in decades," observed Indian journalist Rana Ayyub. "... It's clear that Facebook has no intention of holding hate-mongers accountable and that the safety of users is not a priority."

In June, The Gambia asked a US district court to compel Facebook to release "all documents and communications produced, drafted, posted or published on the Facebook page" of Myanmar military officials and security forces, in order to evaluate what role they played in the mass violence against the Rohingya. Facebook indicated that it would evaluate the request.

Standing with powerful states

The hopes of Rohingya activists were buoyed when Facebook's head of cybersecurity policy, Nathaniel Gleicher, acknowledged the company had found "clear and deliberate attempts to covertly spread propaganda that were directly linked to the Myanmar military".

"I wouldn't say Facebook is directly involved in the ethnic cleansing, but there is a responsibility they had to take proper action to avoid becoming an instigator of genocide," Thet Swe Win, who founded Synergy, a group devoted to encouraging social cohesion in Myanmar, told the New York Times.

Rohingya refugees pictured at a camp in southern Bangladesh on 11 December 2019 (AFP)

This month, however, Facebook rejected The Gambia's request, arguing that the release of "all documents and communications" by key military officials and police forces was "extraordinarily broad" and would constitute "special and unbounded access" to accounts. 

The profit motive apparently drives Facebook to stand with powerful states and against the victimised and downtrodden. The idea that Facebook is an impartial platform built on fairness and equality for all is patently absurd, given that it is a for-profit corporation that bases its commercial decisions on the quest for ever-higher revenues.

'Only Muslims get blocked'

There is much evidence of this in both India and Israel/Palestine. A 2019 report noted that WhatsApp, the messaging app owned by Facebook, blocked or shut down around 100 accounts belonging to Palestinian journalists and activists, preventing them from sharing information and updates as Israeli warplanes pounded Gaza in November 2019.

Facebook has also been accused of showing favouritism to Israel by categorising vague or even commonly used Arabic terms or slogans as "incitement to violence," while simultaneously turning a blind eye to Israeli accounts that openly call for "death to Arabs". Facebook has revealed a "political bias in favour of elevating the Israeli narrative while suppressing the Palestinian one," observed +972 Magazine.

Marwa Fatafta, a Palestinian writer and policy analyst, says that Facebook "cannot use ignorance as an excuse," noting that "economic and political incentives" explain why social media companies comply with Israeli government requests.

In Kashmir as well, journalists and human rights activists have for years accused Facebook of censoring content that casts Indian security forces in a negative light. Four weeks after India revoked Kashmir's autonomous status in August 2019, Facebook suspended scores of accounts over posts on the disputed territory, including "Stand With Kashmir," a page owned and managed by a Kashmiri American based in Chicago.

"Why is it that only Muslims get blocked? Facebook is being one-sided by supporting the atrocities committed by the Indian army. Other people can say whatever they want, but if Muslims say something, we get blocked. It is not neutral," Rizwan Sajid, a Kashmiri activist, told the Guardian.

A clear choice

What's clear is that Facebook, like much of the international community, appears to hold a bias against Muslims; both are oriented towards the economic and strategic interests of non-Muslim-majority countries, where the social media giant extracts the lion's share of its profits.

Amarnath Amarasingam is an expert in violent extremism and the author of Sri Lanka: The Struggle For Peace in the Aftermath of War. He told MEE: "Many of the more frontline people at Facebook dealing with hate speech and incidents of violence against minorities - often Muslims - in places like India, Myanmar [and] Sri Lanka are quite knowledgeable, and I've found them to be eager and open when it comes to reaching out to experts and learning about the ground realities. I fear that at the leadership level, different calculations are at play."

When it comes to the choice between social responsibility and responsibility to shareholders, it would appear that Facebook is eschewing measures that might impede delivering greater profits to the latter.

The views expressed in this article belong to the author and do not necessarily reflect the editorial policy of Middle East Eye.

This article is available in French on Middle East Eye French edition.

CJ Werleman is a journalist, columnist and analyst on conflict and terrorism.