A consortium of pro-Israel organizations has filed a class-action lawsuit against Facebook, seeking to force the social media giant to censor “incitement” and stop linking “terrorists” with one another. This is a 20th-century solution to a 21st-century problem, and will not work.
Fortunately, there is a better way.
There is no question that social media — chiefly Facebook, Twitter and Instagram — are now tools of terror. Not just Palestinian terrorists but also extremists of all varieties (including Jewish ones) use social media to post direct threats of violence, incendiary videos stripped of context and even explicit instructions on how to carry out attacks effectively.
Because of the decentralized nature of social media, the “stabbing intifada” has proceeded in a chaotic fashion, sometimes encouraged by Palestinian leaders and sometimes discouraged. One Israeli security expert aptly described the phenomenon as “an octopus with tentacles but no head.”

The most recent tragic case in point: Richard Lakin, a 76-year-old American immigrant to Israel and a lifelong advocate of peace and coexistence, who was murdered by three Palestinian terrorists on October 13. Shockingly, while Lakin lay in the hospital, his family discovered social media posts re-enacting the murder and providing instructions on how to repeat it.
Of course, social media have been tools not just for terrorists but also for pro-democracy activists, pro-Israel activists and pro-peace activists, and for ordinary people around the globe to share information, network and exchange best practices. Lakin himself had 4,634 followers on Twitter, which he used along with Facebook to promote pro-peace and pro-coexistence messages.
Which is the first problem with the new lawsuit, filed by a group of right-leaning organizations led by Shurat HaDin Israel Law Center. If a multinational corporation is supposed to censor anti-Israeli speech, it will surely have to censor anti-Palestinian speech as well. In just the past few weeks, both pro-Israeli and pro-Palestinian people in my “friend” circles have posted incitements to violence, misleading photos and racist statements.
All these would surely have to be subject to the same standard, leading to a clampdown on speech on all sides.
Second is the question of definition. We can all agree on the easy cases of incitement. One post cited in the complaint, for example, reads, “We demand that our young people arrive at the Shuafat camp so thousands attack the crossing.” That is indeed an incitement to a specific act of violence.
But it is also already against Facebook’s rules, and would surely be removed if reported. Facebook also complies with restraining orders and other legal mechanisms to protect individuals from harassment, and it bans serial offenders.
The problem comes in more difficult cases. Is it “incitement” simply to post a gruesome photo? What if the poster says, vaguely, that “something must be done”? How about, “We must fight back”? Once again, pro-Israel as well as anti-Israel Facebookers say these things, and far, far worse.
In fact, most of the examples of “incitement” in the lawsuit’s complaint are clearly protected political speech, praising the “knife intifada” in general, and no different, legally speaking, from my friends’ repeated posts praising the Israel Defense Forces. Thus, Facebook already bans precisely the speech that should be banned, and the speech that is not banned should not be.
Finally, even if a broader set of guidelines were somehow appropriate, the last kind of entity I want censoring speech is a corporation like Facebook. Defining hate speech is hard enough; patrolling incitement is even harder. Facebook — and Google, for that matter — function much better as more or less transparent platforms that don’t sift out good speech from bad (if that were even technically possible). As for-profit corporations, they shouldn’t be in the business of restricting some speech while promoting other speech. Neither, for that matter, should the government.
The plaintiffs know all this, of course, leading one to infer that the entire lawsuit is, first and foremost, a publicity stunt, designed to gain attention (and donations) for the lawyers behind it.
This inference is borne out by the thin legal reasoning of the complaint, two-thirds of which is merely a list, complete with typos, of plaintiffs who joined by clicking a button, and the remainder of which reads like yellow journalism from the right-wing echo chamber.
Its sole legal reasoning — made mostly in the context of nonbinding Israeli law rather than American legal standards — is that, because Facebook’s automated algorithms link together like-minded people, the company is not a mere platform but is instead “facilitating, encouraging, and brokering the connections between and among these terrorist organizations and hundreds of thousands of individuals and organizations who have indicated through their Facebook pages that they sympathize with the terrorist-cause [sic].”
Of course, it also facilitates, encourages and connects groups of Esperanto speakers, model-plane builders and Girl Scouts. And the complaint’s request for injunctive relief — that Facebook monitor and remove all pages containing “incitement to murder Jews” and “cease serving as match-maker” for terrorists — fails all the tests listed above. Even the examples the complaint itself lists fall squarely within the gray area between actual incitement and legitimate (if offensive) political speech.
But while this is a transparent publicity stunt, thinly argued and legally frivolous, it’s not entirely wrong either.
Surely something must be done to prevent outrages like that following Lakin’s stabbing, and the daily posting of clear incitements to violence. And it’s worth remembering that some offenders are not individual tweeters but large organizations. The Times of Israel reported, for example, that the Facebook pages of the Islamic-Jihad-affiliated Quds News Network (3.6 million “likes”) and the Hamas-affiliated Shehab News Agency (4.1 million “likes”) often post horrifying images of Palestinian victims, accompanied by exhortations such as “Stab!” or “al-Aqsa is in danger!”
In other words, we’re not talking about kids at home on their computers here; we’re talking about the actual terrorist organizations that carry out acts of terror — or about their carefully distinguished but ultimately intertwined news organs. The solution, though, must be as decentralized as the problem.
If Israel’s supporters are serious about stopping online incitement, we must all get to work and report it ourselves. Facebook’s existing standards are strict enough to remove anti-Semitic hate speech and “fire in a crowded theater”-type incitement. The trouble is, we’re not fast enough in reporting it.
Instead of raising money by filing a frivolous lawsuit, opponents of the “electronic intifada” should organize teams of Arabic-literate volunteers to navigate through pro-Palestinian social networks, monitor key Facebook and Twitter accounts, immediately report incitement and blast out “reporting actions” to their own social media cohorts. If each volunteer builds a cohort of a few hundred dedicated whistleblowers, speech calling for specific violent acts can be reported within minutes of being posted, before it can do harm.
The sharing of grievances, information and political views, however objectionable to some, is part of political speech in the 21st century. Fortunately, the existing dynamics of social media will weed out over-reporters; monitors who cry wolf each time someone posts an offensive image are wasting the time of their followers, and eventually no one will follow them. Meanwhile, those monitors who effectively target removable incitement and peel back the veil of legitimacy covering terrorist propaganda will gain credibility and followers.
True, the task will never be finished. Extremists will just set up new accounts and find new ways to spread the word. But crowd-sourced anti-terrorism campaigns will be a lot more successful than a ham-fisted demand for legal relief that smacks of censorship, partisanship and 20th-century reasoning.
If there is a silver lining here, it is that each of us is empowered to make a difference, if we’re willing to put in the effort. So if you are as outraged as I am by armchair terrorists encouraging the murder of Richard Lakin, don’t turn to the courts or to corporations. We have to fight crowdsourcing with crowdsourcing.
Jay Michaelson is a contributing editor to the Forward.