The marketing industry has no shortage of data scientists, many of whom are intimately familiar with Facebook data and how it can be used. Indeed, employing Facebook to conduct simple A/B tests of ad creative and messaging has become commonplace for all sorts of advertisers pitching anything from weight-loss products to political candidates.
But what happens when Facebook itself does a little A/B testing with users and their posts as part of a psychological experiment?
A study conducted by Facebook's data scientists, which recently surfaced in an academic journal, has sparked a firestorm among academic researchers. Facebook's researchers set out to gauge the impact of positive and negative posts on roughly 700,000 users in an experiment conducted in January 2012.
Facebook's data crunchers used common data-parsing software to filter the posts included in the experiment, and the researchers didn't review any of the post content themselves. "As such, it was consistent with Facebook's Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research," said the research team in a brief on the study.
Essentially, the Facebook researchers decided the site's terms and conditions enabled them to conduct the experiment without notifying the people whose news feeds were altered for the study.
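It's worth making that filtering step concrete. The published paper says posts were classified as positive or negative purely by automated word counting against the Linguistic Inquiry and Word Count (LIWC) dictionaries, so no human ever read a user's post. A minimal Python sketch of that style of keyword classification, using tiny hypothetical word lists rather than the actual LIWC dictionaries, might look like this:

```python
# Sketch of LIWC-style automated sentiment tagging: a post is labeled
# by dictionary word matches alone, so no human reads its content.
# These word lists are hypothetical stand-ins for the real LIWC
# dictionaries, which contain thousands of terms.

POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible", "angry"}

def classify_post(text: str) -> str:
    """Return 'positive', 'negative', 'both', or 'neutral' based on
    whether the post contains any dictionary word of each type."""
    words = set(text.lower().split())
    has_pos = bool(words & POSITIVE_WORDS)
    has_neg = bool(words & NEGATIVE_WORDS)
    if has_pos and has_neg:
        return "both"
    if has_pos:
        return "positive"
    if has_neg:
        return "negative"
    return "neutral"

print(classify_post("So happy about the wonderful news"))  # positive
print(classify_post("Traffic this morning was terrible"))  # negative
```

Per the paper, posts tagged this way were then given some probability of being omitted from a user's news feed, depending on the experimental condition that user had been assigned to.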
The key takeaway, as far as Facebook's data folks were concerned: "emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness." In other words, when shown more positive posts, people were more positive in their own subsequent posts. The same effect occurred with negative posts.
Some in the media and academic research community, however, argued that the Facebook data study was unfairly manipulative of the emotions of unwitting users.
Ad Age asked a few data scientists in adland to weigh in on Facebook's research study. Several declined to comment. Here's what the others said:
Manipulation Is in the Eye of the Beholder
"I'll be generous and assume that people that are mad at this are also mad at Barack Obama for 'manipulating' them out of an extra $500 million via A/B fundraising tests."
- Alex Lundry, senior VP and chief data scientist for TargetPoint Consulting, a Republican market research firm
Make Studies Opt-in and Give Participants Data Control (and Cash)
"Most marketing research is opt-in, which is the main issue with this Facebook research, not the delivery mechanism (algorithm-based optimization). Such programs should not just be opt-out but truly opt-in only. Moreover, my results should be shared with me. If I don't feel good about my results becoming a part of the overall pool, I should also be allowed to withdraw my data at any time.
"For the advancement of science, we need to run more of such experiments but it has to be done in a controlled fashion -- in a way that does not prove harmful to anybody participating. At the same time, people who contribute to this advancement, just like in a clinical trial, should also be compensated in the long run (it's their data), not just tech companies like Facebook."
- Puneet Mehta, co-founder and CEO of MobileROI, a mobile cloud data company
A/B Testing for Ads Without Notification Is A-OK
"No! [Marketers should not have to notify people if they're doing A/B testing for ad campaigns]. If they did, they should also notify people when they have implemented the learnings from the testing. Should advertisers running display advertising not be allowed to run four different versions the first week, and thereafter run with the 'winner' of the four the rest of the campaign period? Should the target audience have been advised about the fact that they are now only seeing one of four very different banner ads?"
- Jimmy Schougaard, head of strategy, OgilvyOne North America
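For readers who want the mechanics behind Schougaard's four-banner example, here is a minimal Python sketch. The impression and click counts are hypothetical, and a production test would typically apply a statistical significance check before committing to a winner; this shows only the pick-the-best step he describes:

```python
# Sketch of the four-banner A/B test Schougaard describes: run four
# creatives for week one, then serve only the best performer for the
# rest of the campaign. All numbers here are hypothetical.

impressions = {"A": 25_000, "B": 25_000, "C": 25_000, "D": 25_000}
clicks = {"A": 180, "B": 240, "C": 150, "D": 210}

# Click-through rate per variant, then pick the highest.
ctr = {v: clicks[v] / impressions[v] for v in clicks}
winner = max(ctr, key=ctr.get)

for v in sorted(ctr):
    print(f"Variant {v}: CTR {ctr[v]:.2%}")
print(f"Serving variant {winner} for the rest of the campaign.")
```

On hypothetical numbers like these, variant B's higher click-through rate would win the remaining flight, which is exactly the kind of unannounced optimization Schougaard argues advertisers do routinely.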
Facebook Needs to Show Social Data Scientists It's Serious