
Meta is failing to catch memes and innuendo promoting Holocaust denial, oversight panel concludes

An Instagram post using a SpongeBob SquarePants meme to promote Holocaust denial stayed up despite six complaints from users that generated four automated reviews and two human assessments

(JTA) – An Instagram post using a SpongeBob SquarePants meme to promote Holocaust denial managed to evade Meta’s system for removing such content, raising questions about the company’s ability to combat certain indirect forms of hate speech, an independent oversight panel concluded in a case published Tuesday.

The finding came in a review of Meta’s handling of a post featuring a meme of Squidward, a character from the cartoon series SpongeBob SquarePants, entitled “Fun Facts about the Holocaust.” A speech bubble next to the character contained lies and distortions about the Holocaust, including false claims that 6 million Jews could not have been murdered and that chimneys of the crematoria at Auschwitz were built only after World War II.

Despite six complaints from users that generated four automated reviews and two human assessments, the post stayed up from September 2020 until last year, when Meta’s Oversight Board decided to examine the case and the company subsequently acknowledged that the post violated its policy against hate speech. The post even survived two user complaints that came after Meta’s adoption in October 2020 of a new rule expanding its hate speech policy to explicitly bar Holocaust denial.

As part of its review in the SpongeBob case, the Oversight Board commissioned a team of researchers to search for Holocaust denial on Meta’s platforms. It was not hard to find examples, including posts using the same Squidward meme to promote other types of antisemitic narratives. Users try to evade Meta’s detection and removal system, the researchers found. Vowels are replaced with symbols, for example, and cartoons and memes offer a way to implicitly deny the history of the Holocaust without directly saying it didn’t happen.

Content moderation on social media is a notoriously difficult task. Some platforms, such as X, formerly known as Twitter, have taken a more hands-off approach, preferring to reduce oversight rather than risk stifling legitimate speech through enforcement errors, an approach that has allowed antisemitism and extremism to proliferate on X.

Meta has gone in the direction of increased moderation, even agreeing to empower the Oversight Board to make binding decisions in disputes over violations of the platform’s content policies. Meta’s vigilance has led in some cases to the accidental removal of content intended to criticize hate speech or educate the public about the Holocaust. The Oversight Board has repeatedly urged Meta to refine its algorithms and processes to reduce the chance of mistakes and improve transparency when it comes to enforcing the ban on Holocaust denial.

Meta set up its oversight panel amid mounting controversy over how the company handles content moderation. The move opened Meta to new scrutiny but did not stop the criticism. When, for example, the panel announced the SpongeBob case and requested comment from the public, it received a flurry of critical responses from Jewish groups and others.

“Holocaust denial and distortion on Meta platforms, on any forum, online or offline, is unequivocally hate speech,” the Anti-Defamation League wrote in its comment. “The Oversight Board must direct Meta to act accordingly by quickly removing violating posts like the one at issue in this case. Waiting for appeals to rise to the Oversight Board is unacceptable.”

Meanwhile, another commenter named the ADL as a negative influence on Meta’s content moderation practices, referring to the group as a “non-representative, Democratic-Leftist organization.”

“Meta has lost all integrity in this area given Metas serious level of hyper-partisan Orwellian-level censorship,” wrote the commenter, who identified themselves as Brett Prince.

In the SpongeBob case, the oversight panel made two new sets of recommendations for Meta. It said that assessing how well Meta is enforcing its ban on Holocaust denial is difficult because human moderators do not record the specific reason they removed a piece of content, a practice the panel urged Meta to change.

The panel also learned that, as of May 2023, Meta was still triaging reviews under an automation policy adopted after the outbreak of the COVID-19 pandemic, under which users’ appeals of moderation decisions were rejected unless deemed “high-risk.” Keeping that policy in place for so long was inappropriate, the panel said.

The spread of hate and misinformation on social media has become an acute problem in the aftermath of Hamas’s Oct. 7 attack on Israel, as platforms have been flooded with depictions of graphic violence and war-related propaganda. In response, the Oversight Board adopted an expedited process for reviewing disputed cases.

This article originally appeared on JTA.org.


