Behavior Aside, Consumers Do Want Control of Their Privacy

What They Do and What They Say May Not Jibe, But It Matters

By Joseph Turow

An interesting eMarketer report, "The Digital Privacy Dilemma," concludes that while consumers say they don't care about digital privacy, marketers would be unwise to ignore the unease consumers really feel about it.

Lead author Mark Dolliver suggests Americans are self-contradictory about marketers' use of their information. It's a view I also hear in discussions with digital-media practitioners. "People may claim to worry about privacy issues," they say, "but look at what they actually do online ... willingly surrendering personal information for a coupon or in a Facebook discussion. The disconnect between what people say and do shows that policymakers and academics misjudge the extent to which the public really cares about the use of data about them by marketers."

This is an important argument that must be taken seriously. My overall response is that the gap between what people say and do doesn't mean they are two-faced when it comes to privacy. Rather, my research suggests the public's seemingly contradictory behavior is rooted in fundamental difficulties that marketers, publishers and policymakers must confront. Consider:

Americans have only a superficial knowledge of how marketers use data. My national phone surveys going back to 1999 show that the majority of Americans know companies follow them, but they have little understanding of data mining or targeting. They also believe the government protects them from the misuse of their information, and from price discrimination, more than it actually does.

People have a life. It's easy for people in the business to suggest that Americans should use anonymizers and other technologies to protect their digital privacy. But learning the ins and outs of the online world can be complex and time-consuming, and people have many other priorities to worry about. Additionally, when going online, whether to Facebook, YouTube or a search engine, most want to meet their needs and then leave. My colleagues and I found that even young adults express web wariness when answering survey questions. But once they are online, accomplishing goals and building emotional relationships may trump rational calculation.

Privacy policies are unreadable. Have you tried to get through a number of them? Clearly only someone knowledgeable about the nuances and code words of the industry (affiliates, third parties, pixels, beacons and more) could begin to make sense of the points they make.

Privacy policies are often "take it or leave it" propositions. Even if someone tried to plow through the verbiage, that admirable soul would not be rewarded. Most privacy policies are what I call "tough luck" contracts: "You either accept how we use your data, or you leave." Sites' approaches to cookies also fit this pattern, telling visitors the site may not work properly if cookies are disabled. Web executives clearly know it's hard for people to leave sites that play important parts in their lives, so they typically don't feel it's worth their while to give people choices when it comes to sharing data with marketers.

Many of the most prominent digital-marketing actors engage in public doubletalk. Consider how Google told its users about its decision -- controversial with advocacy groups and potentially the FTC -- to link information about their activities across its most-popular services and multiple devices. Perhaps to blunt such criticism, the company emblazoned its search page and other holdings with statements such as "This stuff matters" or "Not the same yada yada." But if you clicked Google's link to learn more, the urgency evaporated. The language gave no sense that beginning March 1, to quote the Los Angeles Times, "the only way to turn off the data sharing is to quit Google." Instead, clickers saw the comforting statement that the policy would reflect "our desire to create one beautifully simple and intuitive experience across Google."

Google certainly isn't alone in this two-faced approach. On their landing pages, Amazon and Pandora may suggest their data mining is transparent, showing visitors what the sites have noted of their previous behavior. But a trudge through the privacy policies reveals that this seemingly open approach on the home page actually obscures a far broader, impenetrable use of visitors' data for the companies' own and others' marketing purposes. The Digital Advertising Alliance takes a similar tack with its Ad Choices icon: go to the opt-out area and note the disconnect between the availability of the opt-out choice and the rhetoric there, which makes opting out seem slightly absurd.

Built-in conflict. In the complex stew that is American life, we should not understate the built-in conflict between people's need to live in the digital world and their awareness of their lack of control or knowledge in that world. My research suggests that Americans are genuinely concerned about this conflict. They worry about it, but they don't know what to do. In coming years, as data retrieval and personalized advertising become more sophisticated in the mobile, commercial and political spheres, those worries will surely increase. Advertisers and publishers who want to build long-term relationships with customers should help people with these fundamental life-management conflicts. They can do it by giving people real control over the information held about them rather than by exploiting their ignorance for short- or even medium-term gain.

ABOUT THE AUTHOR
Joseph Turow is the Robert Lewis Shayon Professor of Communication at the University of Pennsylvania's Annenberg School for Communication. His newest book, from Yale University Press, is The Daily You: How the New Advertising Industry Is Defining Your Identity and Your Worth.