Facebook's latest revelations about the 2016 election show just how tough it's going to be for the social network to police its platform.
After some prodding, the company this week gave Congress the ads that it says Russia-based groups appear to have bought to influence last year's U.S. presidential campaign. Facebook, Twitter, Google and others have become a focus of concern since it became clear digital ads and fake news were used in disinformation efforts by foreign actors.
On Monday, Facebook also outlined new details of how it was used by those who bought the ads, which numbered 3,000 over two years at a cost of more than $100,000.
Other than misrepresenting the groups behind them, many of the ads didn't break Facebook's rules regarding content, the company said. If a straightforward buyer had purchased them, rather than an allegedly Russian entity masquerading as a variety of U.S. interest groups, Facebook would largely have let them run.
"Many of these ads did not violate our content policies," Facebook said in its blog post. "That means that for most of them, if they had been run by authentic individuals, anywhere, they could have remained on the platform."
On Tuesday, The Washington Post reported that the nefarious advertisers used Custom Audiences, Facebook's sophisticated ad-targeting tool, to try to reach promising targets.
Facebook has promised new measures to vet ad campaigns and purge bad actors, and CEO Mark Zuckerberg even took to the social network over the weekend to apologize for his company's role. Here's what we learned from Facebook's latest disclosures.
What was in them?
The ads were mostly focused on divisive social issues such as immigration, gun control, LGBT rights and race. They were used to get people to follow Facebook pages.
How many people saw them?
Facebook said about 10 million people saw the ads in the U.S., with 44 percent of the impressions coming before the election and the rest after.
Why couldn't they be stopped?
Facebook said people report some 8 million ads every day, and it's able to review only millions a week manually. Facebook is still a mostly automated content machine, but it said it would hire 1,000 more people to review ads, focusing on ones that attempt targeting around topics like cultural affinities.
Why didn't the rubles tip Facebook off?
Facebook said some of the ads were paid for in Russian rubles, but that paying in Russian currency alone is not enough to set off an alarm. Most Russians on Facebook are doing nothing wrong, Facebook said.
Nor does Facebook stop foreign organizations from buying ads in countries outside their homes. "While we may not always agree with the positions of those who would speak on issues here, we believe in their right to do so—just as we believe in the right of Americans to express opinions on issues in other countries," Facebook said in its post.
Is there ever 100 percent "safety"?
No. Facebook said it can't manually review the 8 million ads flagged every day. And many of those ads could very well pass content guidelines despite being objectionable to some.
"Even when we have taken all steps to control abuse, there will be political and social content that will appear on our platform that people will find objectionable, and that we will find objectionable," Facebook said in its blog post. "We permit these messages because we share the values of free speech."