As it did during the general election two years ago, the company, which owns Facebook and Instagram, will remove posts that mislead people about where, when and how to vote, or that call for violence based on the voting or election outcome, according to a statement Tuesday. Ads that discourage people from voting or that question the legitimacy of an election will also be removed.
Meta is working with 10 outside fact-checking partners, including five Spanish-language organizations, to review posts and label them if they’re misleading. The company will also prompt users to check out a section of its site with general voting information, curated by Meta employees.
A key part of Meta’s current approach relates to political advertising. In the week before the election, no new ads on political or social issues can run, as in 2020, though the company says it reviewed and revised the policy. Marketers also won’t be able to change ad creative or audience-targeting parameters during that final week.
“The one thing we wanted to avoid was carrying the highly contentious, inflammatory ads at the last minute, which can’t then be contested,” Clegg said.
Meta has a controversial policy of not fact-checking political ads. The company has debated internally for years over how to handle political ads, with criticism flaring after the 2016 US election, when Facebook and Instagram unwittingly sold ads to Russian trolls seeking to sow discord among US voters. Facebook also came under fire after the Jan. 6, 2021, insurrection, which was organized in part on the platform and fueled by false claims of an illegitimate election.
Meta has considered blocking political ads entirely, an approach used in some European elections, though reaction to the idea was mixed. The US electoral system gives paid speech “a very particular status,” Clegg said.