Anti-Defamation League: Twitch should invest in moderation tools

Jessica Conditt, Senior Editor

When Anti-Defamation League researchers set out to study the harassment of political figures on Twitch, they expected to encounter a breeding ground of live abuse. Previous research out of ADL’s Center for Technology and Society tracked a dramatic rise in online antisemitism and bigotry targeting politicians ahead of the 2020 US presidential election, while another ADL study last year found 81 percent of video game players experienced harassment in online matches. Politicians going live on Twitch, the largest video game streaming site in existence, could easily result in a toxic collision of these two worlds.

“It seemed likely that public events that featured both prominent politicians and video game streamers would become hotbeds of hate and harassment,” ADL wrote in its report, Politics, Gaming Culture, and Content Moderation on Twitch. “But our investigation found that the four prominent political Twitch events we reviewed resulted in primarily positive and inclusive spaces.”

CTS researchers were surprised to encounter relatively low occurrences of abuse in four high-profile political events on Twitch in late 2020 and early 2021. However, this doesn’t mean the platform has eradicated harassment. The CTS also found that a live Twitch show hosted by comedian Jimmy Fallon in April 2021 was overrun with negative comments and emotes tied to racism, and it failed to meet its charity fundraising goal by thousands of dollars.

The key difference, according to the CTS, was a team of experienced moderators.


“Politicians and public figures interested in engaging with Twitch should connect with experienced community moderators and researchers familiar with the platform to create positive spaces for discussion and political engagement,” the CTS wrote. “We also recommend that Twitch require public figures to engage experienced community moderators before hosting events, and the company dedicates additional resources, training, and compensation to ensure a robust pipeline of skilled and experienced community moderators help make the platform a more welcoming space for all people.”

For this study, the CTS collected data from the following four political events held on Twitch between October 2020 and January 2021:

  • October 21st, 2020: Democratic New York Rep. Alexandria Ocasio-Cortez, Democratic Minnesota Rep. Ilhan Omar and popular streamers played Among Us in a get-out-the-vote event. There were 225,771 comments on one channel.

  • November 28th, 2020: Ocasio-Cortez, Canadian Member of Parliament Jagmeet Singh and popular streamers played Among Us in a COVID-19 relief effort. There were 283,762 comments on two channels.

  • December 22nd, 2020: Ahead of contentious runoff elections in Georgia, Democratic US Senate candidates Raphael Warnock and Jon Ossoff played Uno with streamers from the state in a get-out-the-vote campaign. There were 6,233 comments on one channel.

  • January 28th, 2021: Ocasio-Cortez, streamer TheStockGuy, Reddit co-founder Alexis Ohanian and economic policy analyst Alexis Goldstein discussed GameStop stonks. There were 125,796 comments on one channel.

The CTS compared these events to Fallon’s Twitch appearance on April 6th, 2021, where he played Among Us with members of The Roots and other famous folks in order to raise money for the charity Feeding America. That stream resulted in 74,242 comments on one channel.

On Fallon’s stream, the most popular chat expression was the TriHard emote, an image with negative connotations that’s been used to harass Black streamers specifically. The TriHard emote was used in 2,541 comments on Fallon’s stream, representing nearly 4 percent of all chat activity, the CTS found.

By contrast, the most popular comments in Ocasio-Cortez’s January 2021 stream about GameStop were “to the moon” and “hold the line,” benign references to the r/WallStreetBets community. These represented roughly 4 percent of all comments, researchers said.
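The CTS report doesn’t publish its analysis code, but the basic measurement described here — how often a tracked emote or phrase appears, as a share of all chat activity — is straightforward to reproduce. Below is a minimal Python sketch of that kind of frequency count; the `expression_shares` helper and the toy chat log are illustrative assumptions, not the CTS’s actual methodology.

```python
from collections import Counter

def expression_shares(comments, expressions):
    """For each tracked expression, count the comments containing it
    and report that count as a share of total chat activity.
    (Hypothetical helper for illustration, not the CTS's tooling.)"""
    counts = Counter()
    for comment in comments:
        text = comment.lower()
        for expr in expressions:
            if expr.lower() in text:
                counts[expr] += 1
    total = len(comments)
    return {expr: (n, n / total) for expr, n in counts.items()}

# Toy chat log standing in for a real event's hundreds of thousands of comments
chat = ["to the moon 🚀", "hold the line!", "gg", "TO THE MOON", "nice stream"]
shares = expression_shares(chat, ["to the moon", "hold the line"])
# "to the moon" appears in 2 of 5 comments, i.e. a 40 percent share
```

At the scale of the events studied (six thousand to nearly three hundred thousand comments per channel), the same per-comment counting applies; only the data collection from chat logs differs.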


In Ocasio-Cortez’s October 2020 stream, the most prevalent messages were celebrations of LGBTQIA+ pride, comprising 1.16 percent of the chat, according to the CTS. In Fallon’s chat, 1 percent of the conversation focused on his team’s poor attempts at moderation.

Fallon’s stream ended up raising $17,000 of a $25,000 charity goal, and the comedian pledged to donate the remaining cash himself. Meanwhile, Ocasio-Cortez’s November event raised more than $200,000 for coronavirus-related relief organizations.

Here’s how the CTS explained the discrepancies: Fallon employed just three moderators for his stream, and they seemed to be new to the Twitch ecosystem. For her October 2020 stream, Ocasio-Cortez used bots and 14 experienced human moderators, including a handful of Twitch employees. Experienced moderators kept the conversation on track, according to the CTS.

Chat moderators from AOC's October 2020 Twitch event (left) vs. Jimmy Fallon's April 2021 event (right)

“In addition to having a deep understanding of how to moderate large live events, community moderators such as the ones employed by Ocasio-Cortez had extensive knowledge of how to best use the tools the platform made available to them,” the CTS said. “It is important to note that the use of platform tools by experienced community moderators to make events on Twitch respectful and inclusive is not something that happens automatically. The political events analyzed here involved extensive work by community moderation teams making numerous context-driven decisions on how to moderate a channel both before and during an event. The consequences of not undertaking this work are evident in the high levels of hate and harassment present in Fallon’s stream.”

The CTS recommends politicians collaborate with experienced moderators and inform themselves about the Twitch community before going live. It also has a few suggestions for Twitch itself:

  • Enforce its newly revised hate and harassment policies consistently and for all streamers, regardless of popularity. To that end, include data in its transparency report about how specific communities are targeted by hate and harassment on the platform.

  • Increase its investment in moderation tools as the platform grows.

  • Require trained moderators for streams featuring prominent public figures.

  • Provide effective tools, support and justice for marginalized streamers, including “specific ways to escalate reports of harassment to the platform and access to experienced community moderators.”

The CTS study was completed before Twitch rolled out its new chat tools, which introduced threaded replies on April 22nd. This fresh feature doesn’t change the CTS’ conclusion that active moderation is key to building positive Twitch communities.

“Twitch chat [can] turn into a haven of hate and harassment if norms aren’t set by the streamer and community moderators aren’t sufficiently active in enforcing those norms in the stream,” Daniel Kelley, Associate Director of the CTS, told Engadget. “While Twitch chat is more ephemeral than other forms of online harassment, the intense spamming of hateful comments and emotes can make a channel harmful for both the streamer and folks using the chat.”

Essentially, the CTS found that successfully moderating a contentious Twitch stream requires constant attention, thick skin and niche knowledge, and it isn’t something that can happen automatically with the company’s current toolset. For many streamers who fall outside of the platform’s white, cisgender, heterosexual male standard, this isn’t news.

Marginalized Twitch streamers have long attempted to raise awareness about persistent abuse in their chat boxes and inadequate support from the company. Third-party sites and Discord groups are popping up to offer the services these users need, such as Peer2Peer.Live, a hub that allows Twitch streamers to build communities around identity-based tags like “trans,” “disabled” or “Black.” Peer2Peer was founded by Steph Loehr, a trans woman who streams under the name FerociouslySteph.

Peer2Peer.Live

“The essence is that people of marginalized identities feel safest in spaces that understand them, and the easiest way to find those safe spaces is by finding their peers,” Loehr told Engadget earlier this month. “And that discoverability has been totally blocked by Twitch.”

Loehr happens to be on Twitch’s Safety Advisory Council, which means she has a direct line of communication with the company, and representatives have asked for her input regarding safety on the platform. She still had to launch a separate site to incorporate one of her ideas.

“Twitch has not done enough to protect me in the slightest,” she said.

Twitch, for its part, has said it’s aware that marginalized streamers face disproportionate amounts of abuse on the platform. A representative told Engadget this month that the company remains “committed to making Twitch the safest and most inclusive community it can be.”