Facebook just revealed the rules for its independent content board
The board will make irreversible decisions about what posts stay up and come down, even if the company disagrees.
Facebook, ahead of a congressional hearing on violent content, revealed the charter for an independent oversight board that will make irreversible decisions about what posts stay up and come down, even if the company disagrees.
The board, which Facebook first floated in January and which will begin to hear cases early next year, represents the first real check on Facebook’s power to decide who gets a voice on its site. Its members—at least 11 at any given time, with a full roster of 40—will be the final word on controversial cases that affect Facebook’s 2.7 billion users. The board’s charter outlines a vision that is easier said than done.
The members will “exhibit a broad range of knowledge, competencies, diversity and expertise” with no “actual or perceived” conflicts of interest that would affect their decisions on user content, according to the charter revealed Tuesday. They will “collaborate in decision-making to foster an environment of collegiality, and issue principled decisions and policy recommendations using clearly articulated reasoning.” The committee deciding on cases will include one member from the region of the post in dispute.
Facebook spent months deliberating with outside experts on how to ensure the board acts independently, even though members are paid indirectly by the tech giant. Funding is channeled through a trust, and trustees can remove board members only for misconduct, not for the content decisions they make. At stake is the trust of Facebook’s users, who sometimes don’t understand why posts are removed, or why questionable content they report remains online.
The company is also dealing with increasingly damaging types of content, such as posts that recruit terrorists or seek to influence elections. On Wednesday, executives from Facebook, Twitter and Google will testify before a Senate committee on violent content and extremism, following a string of mass shootings, some of which were broadcast live on social media.
Kate Klonick, an assistant professor at St. John’s University Law School, has been embedded at Facebook to observe the oversight board’s creation, including sitting in on meetings with staff. She describes a notable update: The board can provide feedback on Facebook policies, and the company will review that and write a public statement explaining why it did or did not change a policy as a result.
“That’s actually kind of a huge deal,” Klonick says. “That’s probably the most accountable we’ve ever seen Facebook.”
Some elements remain unclear, according to Klonick. The charter references “bylaws”—the “operational procedures of the board”—and a Code of Conduct outlining the “norms, procedures, and proper practices” expected of board members. Neither exists yet, but both will be important for starting the board off in the right direction with the right set of principles, she says.