Facebook and Instagram violated Palestinian rights during Gaza war, internal report finds
Facebook and Instagram’s policies during Israeli attacks on Gaza last year harmed the fundamental human rights of Palestinians, according to a report commissioned by the platforms’ parent company, Meta.
The study, carried out by the consultancy Business for Social Responsibility (BSR), was obtained by The Intercept ahead of its publication later this week.
In May 2021, Meta was widely accused of censorship and bias during Israel’s assault on Gaza, which killed 256 Palestinians, including 66 children. In Israel, 13 people were killed by Palestinian rockets, including two children.
“Meta’s actions in May 2021 appear to have had an adverse human rights impact… on the rights of Palestinian users to freedom of expression, freedom of assembly, political participation, and non-discrimination, and therefore on the ability of Palestinians to share information and insights about their experiences as they occurred,” the report said.
Middle East Eye reported last year that concerns had been raised about deleted social media content and account suspensions in relation to the Sheikh Jarrah neighbourhood in occupied East Jerusalem, where an Israeli crackdown on protests against forcible evictions sparked the escalation in violence.
According to the report, Meta deleted Arabic content related to last year's violence at a much greater rate than Hebrew-language posts. The disparity appeared both in posts reviewed by automated systems and in those reviewed by employees.
BSR attributed the difference in treatment to a lack of expertise. It concluded that Meta lacked staff who understood other cultures, languages and histories, despite having over 70,000 employees and $24bn in cash reserves.
“Potentially violating Arabic content may not have been routed to content reviewers who speak or understand the specific dialect of the content,” the report stated.
During the outbreak of violence, Palestinian and Arab social media users revived an old Arabic font in an attempt to beat Facebook’s algorithm and express their support for Palestine.
The BSR report stopped short of accusing Meta of deliberate bias, pointing instead to "unintentional bias" leading to "different human rights impacts on Palestinian and Arabic-speaking users".
It found that Meta's Dangerous Individuals and Organisations policy, referred to in the report as the DOI policy, which bars users from praising or representing a number of listed groups, focused mainly on Muslim entities and therefore disproportionately affected Palestinians.
“Meta’s DOI policy and the list are more likely to impact Palestinian and Arabic-speaking users, both based upon Meta’s interpretation of legal obligations, and in error,” it said.
“Palestinians are more likely to violate Meta’s DOI policy because of the presence of Hamas as a governing entity in Gaza and political candidates affiliated with designated organisations.”
The study concluded with 21 non-binding recommendations, which included increasing staff capacity to analyse Arabic posts and reforming the Dangerous Individuals and Organisations policy.
Meta vaguely committed to implementing 20 of the recommendations, according to The Intercept. The company did not immediately respond to a request for comment. Its response to the report will be included when the document is published in full.
In a footnote to the response seen by The Intercept, it said: “Meta’s publication of this response should not be construed as an admission, agreement with, or acceptance of any of the findings, conclusions, opinions or viewpoints identified by BSR, nor should the implementation of any suggested reforms be taken as admission of wrongdoing.”