Is Clubhouse equipped to handle hate speech and antisemitism?

What could happen when the antisemitism, racism and hate speech proliferating on Twitter, Facebook and Instagram transform from written posts to spoken, untraceable words?
This April marks a year since Clubhouse, the audio-only social networking platform, was first launched, allowing users to open “rooms” for conversations on topics of interest. In the last few months, the platform has gained a reputation as a thriving, popularized marketplace of — often Jewish — ideas and community. Rabbis become “breakout stars” on the app, observers opine that it holds “a glimpse of the Jewish future” and others say it’s become the “digital version of the Jewish conference circuit hallway.”
Everyone’s talking. One club, “Shabbat Shalom,” has 12,300 members, and another, “Jewish Tribe and Friends,” is coming up on 13,000. You can “Ask a rabbi,” you can ponder “Jewish life on campus” and you can share your favorite Passover recipes. And, as on any other platform, you can encounter hate speech.
Last September, Tablet Magazine’s Yair Rosenberg reported an antisemitic “meltdown” on the app over Yom Kippur, in which users in a 300-person room described Jews as “the face of capital,” argued that “Jews control the banks” and made antisemitic, Holocaust-revisionist statements. As Rosenberg put it then, this outcome felt inevitable: “An iron law of the internet is that it is only a matter of time between the creation of a social media platform and it being used to spread antisemitic conspiracy theories.”
Recently, Shireen Mitchell, an expert in tracking online harassment with the group Stop Online Violence Against Women, has been closely monitoring activity on Clubhouse.
“We’ve been tracking online harassment since 2013,” she said, “and I can tell you the way in which it’s showing up and happening on Clubhouse is drastically different from anything we’ve seen before, because of the audio feature.” On Facebook, there is at least the opportunity to find people’s previous comments and respond to or report them; on Clubhouse, once a comment is spoken, there is no way to come back to it.
“There’s no group so far that hasn’t been in there without some tense, stereotypical harassment in some form,” Mitchell said, referring to the minority groups she has tracked — including Jews.
Reached for comment on their policies concerning hate speech, a Clubhouse spokesperson said, “We strongly condemn all forms of racism, antisemitism, hate speech and abuse, and we stand in support of historically marginalized communities.”
The app’s community guidelines say as much and make clear that hate speech is in direct violation of their standards. Violating those rules, they warn, can result in suspension or removal of an account.
With disappearing recordings, tracking hate becomes elusive
To enforce the guidelines, a Clubhouse staffer familiar with the process told The Forward that the platform generates an encrypted recording at the start of each session. If a participant reports a violation while the room is still active, the app retains the audio until an investigation of the incident is completed. If no report of a violation comes in at the time, the recording auto-deletes.
Still, experts worry that this process rests on shaky assumptions: that a user will report a violation while the abuse is ongoing, that the report will come in before a moderator shuts down the room (which moderators can do at any time), and that anyone will file a report at all.
More to the point, if a Clubhouse room is full of like-minded neo-Nazis spewing antisemitic hate, there may not be anyone around to report it. In Mitchell’s opinion, there’s “absolutely nothing” the administrators could do to successfully intervene under the current guidelines.
“The way moderation is set up on Clubhouse at present is concerning,” Daniel Kelley, the assistant director of the Anti-Defamation League’s Center for Technology and Society, wrote in a statement to The Forward. In his view, Clubhouse’s decision to record live conversations and only store those recordings for enforcement if an incident is reported during the live conversation “does not center the experience of people targeted by or witnessing antisemitism and hate.”
“Instead, it places an additional burden on them to report in the moment or else lose the chance for the platform to address an incident — until it happens again and harms more people,” Kelley added.
In Mitchell’s view, the onus of moderating abuse ultimately falls on the rooms’ panelists. “The moderation becomes your responsibility,” she said. “If something goes wrong in your room, you’re actually held accountable for your room.”
Marni Loffman, 25, a frequent Jewish user of the platform, said that in their experience, individuals on the app have come together organically to support each other after encountering antisemitic tropes.
“I’ve been in a lot of rooms where people are unpacking and processing really awful things that people have just said about Jews,” Loffman said. “But there’s also been really beautiful spaces where I’ve listened to non-Jews advocating for Jews. There was a room started by a non-Jew to address antisemitism in a room that he had been a part of.”
As the app continues to grow and attract more Jewish community members, experts hope to see it develop a more robust reporting mechanism and better systems for handling abuse on the platform. Mitchell said the current approach is “unsustainable.”
“If Clubhouse is to continue to grow,” Kelley wrote, “it must devote significantly more resources toward centering its trust and safety efforts around the experiences of vulnerable and marginalized communities or else it will follow in the footsteps of other social media platforms.”
The ADL continues to be in discussions with Clubhouse management around these issues, according to Kelley.