The following exchange took place on Twitch in April: “The question is, are white people being replaced in America, and if so, is that a bad thing,” a Twitch moderator said. A Twitch streamer chimed in: “That’s what it seems like to me, a lot of immigration, no matter how many large, large, groups of Americans gather to push back against it … Why would they want to force us to allow foreigners in against our will?”
They were discussing the “Great Replacement,” a conspiracy theory that posits a plot in America and other Western countries to suppress the birth rates of white people while allowing Black and brown immigrants to grow in number. Adherents often blame Jewish people for masterminding the plan.
This particular exchange, which appeared in Twitch’s “Just Chatting” category of videos, was preceded by pre-roll advertisements from Hyundai, Dr Pepper, Discover, Meta and DoorDash, advertisers that likely did not know their ads would appear before a debate about the conspiracy theory that veered into racist fearmongering. The “Great Replacement” has become the latest brand safety concern tied to social media after the theory inspired a radicalized gunman who killed 10 people in Buffalo, New York, over the weekend. Most of the victims were Black. The shooter left behind a manifesto filled with racist rhetoric.
The shooting was livestreamed on Twitch for nearly two minutes before being shut down. Video clips made their way to other social platforms, such as Meta’s Facebook. The aftermath of the mass shooting has sparked a new round of soul-searching in the worlds of media and social media about the responsibility of platforms to police their services and tone down violent rhetoric, and about the role brands play in inadvertently funding some of this content.
Twitch, which Amazon acquired in 2014 for $1 billion, features plenty of streams in the “Just Chatting” category that touch on controversial issues.
“Brands do need to take this into account,” said Kayla Gogarty, associate research director at Media Matters for America, a media watchdog group. “With their funding, their advertising money, they have the ability to hold … platforms accountable. We continue to get empty promises. They keep telling us they’re doing what they can to stop misinformation, extremism, and yet still they have these horrible incidents happen.”
Videos of the Buffalo shooting were clipped and pushed into Facebook feeds, Gogarty said, generating potentially millions of views. A Meta spokesperson said the platform “quickly designated the event as a violent terrorist attack” and removed the perpetrator’s account, as well as links to a “manifesto” and videos. In the manifesto posted online, the shooter claimed he was radicalized “by infographics, shitposts, and memes.” He also allegedly had a YouTube account.