The hearing, which aimed to get to the bottom of Facebook's approach to users' data privacy, as well as whether it knew about the disinformation its users were spreading during elections, was attended by lawmakers from the U.K., Canada, Ireland, Brazil, Argentina, Singapore, Belgium, France and Latvia. Facebook CEO Mark Zuckerberg was asked to attend the hearing – dubbed "The International Committee on Disinformation" – but declined the invitation, a decision lawmakers made clear they found unacceptable. Richard Allan, Facebook's VP of public policy, appeared in Zuckerberg's place.
One of the key lines of inquiry at the hearing centered on Pikinis, an app created by the company Six4Three. It briefly lived on Facebook in 2014 and allowed users to quickly surface images of their friends wearing bikinis. This meant the company had access to data belonging to friends of the people who used its software. Facebook caught wind of it, took the app down and was later sued by Six4Three for allegedly destroying its business.
Ted Kramer, CEO of Six4Three, was in London last week and was compelled by British lawmakers to hand over a cache of emails – some of which included responses from Zuckerberg himself – that were related to his legal dispute with Facebook. In a statement, Damian Collins, the British politician who executed the order to force Six4Three to hand over the emails, said: "We are in uncharted territory. This is an unprecedented move but it's an unprecedented situation. We've failed to get answers from Facebook and we believe the documents contain information of very high public interest."
The emails were not publicly released, but Collins previously suggested he had the legal power to make them available.
Still, during Tuesday's hearing Collins did reveal one detail from the emails, saying that the documents suggested that an engineer at Facebook notified the company in October 2014 that entities with Russian IP addresses were pulling three billion data points a day from Facebook users. Allan declined to answer any questions regarding the cache of emails, only saying, "Any information that you have seen in that cache is at best partial and potentially misleading."
Here, highlights from lawmakers' questions to Facebook. Some of the comments have been lightly edited for clarity.
Nele Lijnen, Belgium: "Do you know the expression, 'Sending your cats?'"
Lijnen: "Well, I'm from Belgium and in my language it means not showing up. So dear colleagues, we can say Mark Zuckerberg sent his cats. Earlier you said that Facebook wants to get out of being confrontational [with lawmakers]. Do you think Zuckerberg sending his cat, or not showing up today, gets out of being confrontational?"
Allan: "I hope I am able to assist as a cat."
Brendan O'Hara, U.K.: "Were you sent because you, in the entire Facebook empire, are the best person to answer all these questions, or because you're best placed to defend the company?"
Allan: "I'm sent because I can answer your questions."
O'Hara: "Who decided you were the best person to come here?"
Allan: "I volunteered." (Crowd laughs)
Edwin Tong, Singapore: "In March of this year, someone wrote a post that translates to, 'Kill all Muslims. Don't even let an infant escape.' Would that be hate speech?"
Allan: "Yes, that would be hate speech."
Tong: "It was put up at a time when there were tensions between [the people of Sri Lanka] and Muslims. There were deaths. The government declared a state of emergency. Putting up such a post would travel far and divide the community, yes?"
Allan: "That would be high priority content for us to remove."
Tong: "One of your users flagged that content. Why was it that Facebook refused to take it down?"
Allan: "There are two possible reasons why it wasn't taken down, and one of them is simple error."
Tong: "Let me stop you there. You will see that Facebook's response reads, 'Thank you for the report. You did the right thing. But it doesn't go against one of our specific community standards.'"
Allan: "That was a mistake."
Tong: "Would you accept that this case illustrates that Facebook cannot be trusted to make the assessment as to what can appear on its platform?"
Allan: "No, we make mistakes. Our goal is to reduce those mistakes and it is why we are investing heavily in AI, where we create a dictionary of hate speech terms so content such as this does not appear."
Paul Farrelly, U.K.: Referring to the emails, Farrelly asked, "What has Facebook got to hide?"
Allan: "I don't think we have a public duty to put into the public domain all of our internal discussions around an issue. I think it is appropriate for us to decide how we settle that issue. I don't think it is reasonable to share any more than you would with your own internal discussions that could be quite robust."
Farrelly: "So you have nothing to hide?"
Allan: "In terms of what we did, no. In terms of whether we are comfortable airing all those internal conversations and all those robust comments that people made in those conversations and have them treated as the company's official position, no, I do not think that is fair."
Farrelly: "Do you accept that Facebook needs to be regulated?"
Allan: "Yes. To the extent that you set the rules for your elections and we have a simple playbook to follow, that would be extremely helpful."