
How can YouTube and Reddit successfully fight Holocaust denial, but not Facebook?

While Facebook continues to struggle with Holocaust denial on its platform, policies implemented by YouTube and Reddit show that eliminating anti-Semitism on social media is still possible, a British think tank claimed in a report released last week.

The document, from the Institute for Strategic Dialogue, tracked the frequency of the word “Holohoax” – a term commonly used by Holocaust deniers – on various social media platforms over the past two years. It found that Facebook, the world’s largest social media network, “provide[s] a home to an established and active community of Holocaust deniers,” whereas such conspiracy theories had significantly decreased on YouTube after a policy change by the Google-run company in 2019.

The report found 36 Facebook groups specifically dedicated to Holocaust denial, and argued that the platform’s algorithm created a “snowball” effect whereby people who like one piece of anti-Semitic content will be recommended similar pages.

Facebook has been criticized for years for its laissez-faire approach to Holocaust denial. CEO Mark Zuckerberg said in a 2018 interview that he was opposed on principle to removing many forms of hate speech from the social network. “I’m Jewish, and there’s a set of people who deny that the Holocaust happened. I find that deeply offensive,” he told Recode’s Kara Swisher. “But at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong.”

That stance has been criticized by several civil rights organizations, among them the Anti-Defamation League, which is among the leaders of a campaign to boycott Facebook until it changes its practices. “Holocaust Denial is a despicable, antisemitic conspiracy theory that Jews hoaxed the entire world and @ISDglobal’s research reinforces what we know to be true: Facebook not only profits off hate, they amplify and recommend it,” the organization wrote on Twitter.

Facebook did announce last week that it would change its policies to ban conspiracy theories about supposed Jewish global domination, as well as images of blackface. However, its announcement made no mention of any changes in its policy towards Holocaust denial.

Two platforms that have made such changes are YouTube and Reddit — and according to the Institute for Strategic Dialogue, those changes appear to have been relatively successful.

The video company announced algorithm and policy changes last year designed to remove or minimize extremist and racist videos, including bans on videos promoting Nazism and Holocaust denial. After the changes were made, “the spread of Holocaust denial content dropped significantly on YouTube,” the think tank found.

Additionally, YouTube announced in July that it had removed 25,000 accounts for violating its hate speech rules, including ones belonging to white nationalists Richard Spencer and David Duke.

If anything, the changes may have been overly broad. In 2019, a high school history teacher’s videos about World War II were also deleted, as were an anti-racism researcher’s videos documenting extremists’ anti-Semitic activity. Both accounts were reinstated after YouTube responded to media reports on the incidents.

The Institute for Strategic Dialogue also considered Reddit a qualified success. Although the frequency of Holocaust-denial posts stayed relatively constant between 2018 and 2020, the think tank noted that Reddit employees and volunteer moderators had been relatively successful at minimizing their spread. The report noted that the company had successfully banned several users and “subreddits,” or message boards, that had concentrated on Holocaust denial, and praised its users for being proactive about pushing back against Holocaust-denial posts via comments and “downvoting.”

In a statement, a Reddit spokesperson noted that at the end of June — the very tail end of the think tank’s study, which tracked content through July 2020 — it had changed its content policy to prohibit content that promotes hate “based on identity or vulnerability,” and had in the process banned thousands of subreddits, including many anti-Semitic ones. “Additionally, we have dedicated teams that enforce our site-wide policies, proactively go after bad actors on the site, and have built internal tooling to detect and remove policy-breaking content,” the spokesperson said.

Correction, August 19: A previous version of this article said that Reddit changed its policies in June to prohibit content that promotes violence or hate. In fact, promoting violent content was already forbidden before the change was made.

Aiden Pink is the deputy news editor of the Forward. Contact him at [email protected] or follow him on Twitter @aidenpink
