A subreddit is what Reddit calls its user-run communities, and there are more than 100,000 active subreddits. Moderators who create the communities typically set the rules for their forums and enforce individualized policies, but Reddit administrators have the power to step in when communities break sitewide rules.
Reddit explained the action it was taking in a public announcement on Monday:
All communities on Reddit must abide by our content policy in good faith. We banned r/The_Donald because it has not done so, despite every opportunity. The community has consistently hosted and upvoted more rule-breaking content than average (Rule 1), antagonized us and other communities (Rules 2 and 8), and its mods have refused to meet our most basic expectations. Until now, we’ve worked in good faith to help them preserve the community as a space for its users—through warnings, mod changes, quarantining, and more.
Though smaller, r/ChapoTrapHouse was banned for similar reasons: They consistently host rule-breaking content and their mods have demonstrated no intention of reining in their community.
Reddit said "rule 1" states that communities and users are forbidden from promoting hate based on people's identities or vulnerable characteristics. "Rule 2" deals with the need for communities to adhere to the site's policy that users post in good faith, with "authentic, personal interest," and don't abuse that policy to spam Reddit or engage in attacks on other subreddits.
The_Donald has been a provocative community for years, and especially so during the 2016 election. Other communities had complained that members of The_Donald promoted offensive speech and coordinated to manipulate other forums. On Reddit, posts are upvoted and downvoted by users, so coordinated groups of users can drive posts to the front page of the site.
Last year, Reddit took measures to "quarantine" The_Donald, which isolated the community from the rest of the site.
Reddit has been trying to clean up its act after years of having a more-permissive attitude that mostly allowed communities to run themselves. Recently, sitewide administrators have been taking a firmer approach, as Reddit tries to become a more mainstream destination.
Reddit has been taking stock of its policies over the past month, since the killing of George Floyd in Minneapolis sparked worldwide protests. Reddit, along with rivals like Facebook, YouTube and Twitter, has been called out for allowing a subculture of hatred to circulate beneath the friendlier parts of the site.
Earlier this month, co-founder Alexis Ohanian resigned from the board of directors with a message that Reddit needed to do more to stamp out hatred. Ohanian also called on Reddit to replace him with a Black leader; CEO Steve Huffman took that recommendation and appointed Michael Seibel, CEO of startup incubator Y Combinator.
Reddit has about 430 million monthly visitors and considers itself one of the most popular websites in the world. The company is trying to appeal more to mainstream advertisers, and this year it picked R/GA as its agency of record to help craft its brand image in a marketing campaign.
Advertisers, however, have been concerned about supporting dark digital environments, and there has been a movement among major companies to hold platforms more accountable for the content spread within their domains.
Facebook is perhaps the biggest example of a company facing that pressure. A brand boycott is brewing for July, organized by civil rights groups that want it to do more to stop disinformation and hate.
Trump has become an easy target for internet companies looking to prove they are tough on offensive speech. In recent weeks, Snapchat demoted the Trump campaign's account, kicking it off its media center called Discover, because of remarks Trump made online about "shooting" protesters. Twitter punished Trump's personal account, too. On Friday, Facebook announced new policies that would give it the ability to put warning labels on offensive messages, crafted with officials like Trump in mind.
On Monday, Reddit said the purge of 2,000 communities mostly involved inactive forums. It also said the move was just the start of its mission to craft more rules around hate speech.
"Ultimately, it’s our responsibility to support our communities by taking stronger action against those who try to weaponize parts of Reddit against other people," the company said. "In the near term, this support will translate into some of the product work we discussed with mods. But it starts with dealing squarely with the hate we can mitigate today through our policies and enforcement."