Is Facebook biased against Palestinians?
During the most recent Gaza conflagration, both Israelis and Palestinians hurled accusations of censorship at social media giants, including Facebook and Instagram. Palestinians said that their posts about Israeli soldiers entering the al-Aqsa mosque complex in Jerusalem were disappearing without even a warning, and so were other posts accusing Israel of violence.
Meanwhile, Israeli influencers noted lower-than-usual views on posts defending Israel’s actions in Gaza or East Jerusalem, and complained that posts provoking violence against Israelis were left up.
Now, Facebook’s own oversight board is wondering if there’s something to these criticisms.
Facebook, like every major social media company, faces enormous challenges in moderating content on its platform. In recent years, news outlets such as The Verge, Radiolab and NBC have published major investigations into the social media giant’s moderation process, which relies on both artificial intelligence and thousands of employees sourced through third-party companies. These employees sift through hundreds of violent or offensive images a day, making decisions that require a nuanced understanding of Facebook’s policies and of the context of each post.
It’s a fallible system, so an independent board, funded and administered separately from Facebook, was created to oversee moderation decisions and ensure that they neither infringe on freedom of speech nor allow harm. The board, whose 40 members come from diverse backgrounds and bring a wide range of expertise, reviews a selection of “highly emblematic” cases to determine whether the moderation choices were in line with Facebook’s policies and, if not, to overturn them. Members of the public can also submit comments to the board during the review process. In its first set of rulings, in February, the board overturned five of the six decisions it reviewed.
In a decision published September 14, the board expressed concern about failures of the moderation process with regard to the Israeli-Palestinian conflict. The concerns appeared in its ruling on the case of an Egyptian man who reposted an al-Jazeera article on May 10 with the one-word caption “ooh”; the original article, which was not removed from al-Jazeera’s page, mentioned a threat of violence from the Al-Qassam Brigade, Hamas’s military wing, if Israeli forces did not withdraw from the al-Aqsa compound.
Facebook could not explain why the Egyptian man’s post was removed, as moderators are not required to cite reasons for their decisions, but two moderators, including a native Arabic speaker, had agreed that it should be taken down. The board ruled that “ooh” was a neutral caption that did not endorse the Al-Qassam Brigade, and that the post should be reinstated.
The board also highlighted allegations that Facebook disproportionately removes Palestinian content while leaving up content that foments anti-Arab and anti-Israeli violence.
Additionally, the board asked whether the Israeli government had made any requests, official or unofficial, to remove content relating to the conflagration with Gaza in April and May. Facebook said that no government had been involved in removing the specific post under review in this case, but declined to provide further information about the Israeli government’s broader relationship with the company.
The oversight board can also make policy recommendations to Facebook. In this case, it recommended that Facebook be more transparent about possible government involvement and suggested that the company “engage an independent entity not associated with either side of the Israeli-Palestinian conflict to conduct a thorough examination to determine whether Facebook’s content moderation in Arabic and Hebrew, including its use of automation, have been applied without bias.”
The decision also noted that the ability of “Palestinians and their supporters to express themselves is highly restricted,” citing this as part of the board’s concern regarding censorship of Palestinian and Arabic content.
Accusations of censorship are common in the Israeli-Palestinian conflict, but it is notable that Facebook’s own oversight board is lending those suspicions credence, raising concerns about bias in the company’s moderation of the conflict and about whether the Israeli government influenced it.
In May, Facebook told the Forward that it partners with local experts and organizations to help it understand the context of rapidly changing situations and employs native speakers who “understand local cultural context and nuances,” which it said is its practice for all controversial issues. But an understanding of cultural nuance can come with cultural biases, and Facebook did not share how those local experts and native speakers were chosen.
The board noted that this case was “among several appeals that concerned content relating to the conflict” — clearly, the system is still deeply flawed.