Mark Zuckerberg will testify before Congress on April 11. It's going to be a big day for Zuckerberg. It's going to be an even bigger day regarding the future of data-driven advertising, data-driven marketing, and data-driven decision making. Because of Facebook's scale, Zuckerberg's testimony is likely to have an impact on how much (and what kind of) data we can use to train AI systems and statistical machine learning models going forward. (See my column, "All Your Data Are Belong to Us").
April 11 may mark the beginning of a governmental war on progress unlike anything we have seen in modern times. After all, the government won't just regulate Facebook; it will regulate the collection and use of data writ large.
Sweeping regulation of data collection and use is probably necessary, but government regulation cannot be written as a knee-jerk reaction to third-party abuse of a four-year-old, business-to-business policy Facebook shut down three years ago. (See: "What Facebook Data Did They Get and What Did They Do?")
Make no mistake: I am not a Facebook apologist. Facebook got itself into this mess, and it will have to get itself out of it. That said, I could not resist the temptation to imagine what I might do if I were in Facebook's shoes. So, as a starting point for your own "If I were Facebook" fantasy congressional hearing, here's my list of the five things Facebook must do right now. (We can deal with fake news, tribalism, confirmation bias, social media addiction and other issues another time.)
1. Make a data privacy button
Build and deploy a "Make my data private now" button and pin it to the top of everyone's newsfeed for the foreseeable future. Under that button, explain what "Make my data private now" will mean in simple, easy-to-understand terms. For example: You will still see ads, and you will still have a newsfeed, but nothing about you will be shared with any company outside of Facebook. You, our beloved and cherished Facebook user, will need to find other ways to log in to all the apps you used Facebook Login to sign up for. If you press this button, no third parties will be able to access any of your data until you personally change your settings, which you are under no obligation to do—ever.
2. Simplify profile settings
Then, right under the explanation of the "Make my data private now" button, put a smaller button that says: "Learn about Other Privacy Options." This button will take you to a landing page with two super-simple options: "Stuff I'm Willing to Share with My Friends" and "Data I'm Willing to Share with Advertisers."
The "Stuff I'm Willing to Share with My Friends" section is self-explanatory. These settings can be more detailed, but they should be simpler than they are now. These are public settings (not privacy settings) and should be clearly labeled as such. "Who can see my pictures?" "Who can see my posts?" All of this is on Facebook right now, but it's buried so deep that you have to spend far too much time finding it.
The "Data I'm Willing to Share with Advertisers" button is nowhere near self-explanatory. These would be real privacy settings. Name, email address, age, gender, location, home address, etc., would have super-granular check boxes that would define what data you would allow Facebook to share with a third party.
3. Actually introduce anonymous login
In 2014, Facebook introduced "Anonymous Login," a "brand new way to log into apps without sharing any personal information from Facebook." It never actually arrived. Facebook must make Anonymous Login the default identity toolset for developers and restrict full Facebook Login to partners that have been through a vetting process and background checks.
4. Lead by example
Facebook should act as if it has already been regulated and lead (rather than follow). Facebook announced a whole bunch of new policies it is going to implement in the wake of the Cambridge Analytica scandal and ahead of the congressional hearing. (No matter what outside forces inspired these changes, they are all quite good.) But Facebook needs to do more. It must lead the world in responsible innovation. It must lead the world in responsible regulation. It must form a self-regulatory organization with other big data organizations and lead the industry so that government doesn't "try and convict" all data-driven businesses in the court of public opinion.
5. Teach, don't preach
Facebook must stop acting like we're all stupid for being afraid of the power it has amassed. The fact that people do not understand the Cambridge Analytica scandal, or what data was used, or how, should tell you that most people have no idea what Facebook is doing "for" or "to" them. People fear what they do not understand, and it is human nature to destroy things you perceive as threatening or dangerous.
Right now, an angry mob is marching through the forest on its way to Menlo Park. It is being led by politicians who believe that Facebook has the power to get them elected (or un-elected), traditional media outlets that hate Facebook for the money it has taken out of their pockets, Facebook users who feel betrayed, Facebook users who are scared, and non-Facebook users who are caught up in the general techlash.
The mob is getting bigger every day, and you, Facebook, need to tell that mob, and the rest of us, what you are going to do to help us trust you again.
Author's note: This is not a sponsored post. I am the author of this article and it expresses my own opinions. I am not, nor is my company, receiving compensation for it.