Facebook has argued that its systems for detecting hate speech are getting better, and releases quarterly reports to demonstrate how much content the service removes. Executives have said that automation is essential because the company has more than 3 billion monthly users around the world using at least one of its products. Improvements in Facebook’s automated systems led to the spike in takedowns, said Guy Rosen, the Facebook vice president focused on integrity.
“The change was largely driven by the increase in proactive detection as driven through the technology that we have been working on,” Rosen said on a call with reporters. In a blog post, Facebook said its technology has been expanded to better catch hate speech in languages other than English, including Spanish, Arabic and Burmese.
But auditors worry that this approach can lead to blind spots if humans don’t review posts that may be tricky or nuanced, such as those with misleading information about voting. The civil rights audit released in July found “most posts reported as ‘voter interference’ are not sent to human content reviewers.”
Facebook didn’t provide the number of posts removed for inaccuracies or falsehoods about voting, but it has started to label all messages about voting with a link to more information. The company also said it has taken down more than 7 million posts on Facebook and Instagram, its photo-sharing app, for spreading misinformation about COVID-19 between April and June, and labeled another 98 million posts on the main social network. The company said Tuesday that it will submit to a third-party audit of these numbers starting in 2021.
Nefarious activity
Despite the company’s efforts, Facebook is still used by some for nefarious activity. NBC News reported on Monday that private Facebook Groups and Pages dedicated to the far-right QAnon conspiracy theory have millions of followers on the platform. The company’s community standards enforcement report released Tuesday didn’t mention QAnon.
While not commenting specifically on the NBC report, Monika Bickert, Facebook’s global head of policy management, said Tuesday that Facebook has removed QAnon pages and groups in the past, and will “keep looking at other ways for making sure we are addressing that content.”
Also on Tuesday, top Democratic lawmakers on the U.S. House Energy & Commerce Committee, which oversees technology policy, urged members of Facebook’s new Oversight Board for content to press for changes to moderation policies, and to resign if Facebook refuses to listen.
Representative Frank Pallone, who chairs the panel, and two other lawmakers lamented that the board’s remit is limited to interpreting existing policies rather than recommending rules to tackle divisive and conspiratorial content.
“For that reason, we believe the Oversight Board will be unable to address the damage Facebook is inflicting on society unless Facebook itself amends its content policies or empowers a truly independent Oversight Board to render binding decisions that cannot be overruled by Mark Zuckerberg or his subordinates,” Pallone, of New Jersey, and Representatives Jan Schakowsky of Illinois and Mike Doyle of Pennsylvania, wrote in a letter to the board.
Facebook has said that the Oversight Board, which is not yet operational, will make final decisions on content that has been removed from the company’s platforms but whose removal is disputed.
—Bloomberg News