5 things we learned about Facebook content moderation from a Reddit AMA

Moderating comments for the social media platform can take a serious emotional toll



Facebook content moderators are suffering breakdowns, paranoia and mental health issues, according to a person who claims to be a moderator for the company and who hosted an anonymous Reddit "Ask Me Anything" Monday night. The AMA took place a day after The Verge published an exposé highlighting the dark underworld of the profession.

Facebook has approximately 15,000 content moderators—most of them employed through third-party contractors—going through posts to filter out some of the most grotesque and horrifying content circulating through the platform, including beheadings, animal abuse and child pornography.

The job leaves some employees suffering from panic attacks and feeling depressed and disturbed by the work, according to The Verge. Employees have reported doing drugs and having sex at work to cope, the article said.

The self-described moderator, who answered questions from Reddit users about doing this work for Facebook and Instagram, characterized The Verge article as "somewhat accurate."

Here are five things we learned from the AMA:

1: Content moderators can read your private messages—but only if they're flagged, the moderator said. (That is, moderators do not have access to personal accounts.) Facebook moderators are alerted to offensive content in private messages by people involved in the chats, as well as by automated systems that identify blacklisted terms and images. According to the moderator, 75 percent of the content-flagging process is automated through artificial intelligence.

2: Hate speech is being posted faster than employees can keep up, according to the moderator. They claimed to read posts containing the N-word up to 150 times a day and posts with violent speech about 50 times a day—and that's just one moderator. "The platform is way too big to properly moderate," the moderator said.

3: Animal abuse apparently isn't heinous enough to violate Facebook's content code of ethics. "Animal abuse stays up," according to the moderator. However, if you're unfortunate enough to come across such a post, the platform will give you a warning that the post contains graphic content.

4: Employees are suffering from mental health issues and the toll of content moderation has turned some of them into conspiracy theorists, the moderator said: "Moderators end up exposed to crazy theories posted by fringe groups on Facebook. For a certain kind of person it can get in their head and seriously mess them up and cause them to question their reality."

5: On a lighter note, bosses tend to be cool with you smoking weed on the job. "It's really easy for me to get high at work and it allows me to numb through all the racism, homophobia I deal with on a daily basis," the moderator said. "It takes a toll after a while but thankfully the bosses are understandable sometimes and leave you alone while your work. So long as you meet production goals, you can do whatever you want on your side screen."

When asked about the requirements for the job the moderator responded: "Smoke tons of weed. Tolerate racism, homophobia, just about anything. If you can't handle animal abuse or videos of people getting killed, this job ain't for you."

Oh, and you also need a high school diploma.

In an emailed statement to Ad Age, Facebook said: "We work hard to enforce our Community Standards by having the right policies and the right mix of technology and a global team working on safety and security. We have invested a lot in technology and will continue to do so. Our Community Standards website explains what stays up and what comes down and provides the internal guidelines we use to enforce those standards. We take very seriously our responsibility to support those that review content but also protect the privacy of our users."
