You Are Big Brother (But That Isn't So Bad)

Marketers Have Mountains of Data That Make Advertising Smarter, but the Government Might Come Calling


Big brother. He's the face of authoritarian government watching every move of every citizen. He's the nemesis in an Apple commercial long held to be a pinnacle of 20th-century advertising.

And, if you're reading these pages, he's likely you.

No, you're not a dictator. But you are watching.

Thanks to technological advances -- and a willing populace -- marketers these days have mountains of consumer data that in previous decades an overeager government could only dream of.

We know you're watching. Credit: Harry Sieplinga/HMS Images

Is that so bad? For all the handwringing about online behavioral tracking, there have been precious few examples of nefarious outcomes. Would the increasingly worried government be right to step in? Or would that stifle the very "big data" innovation that marketers increasingly rely on?

Collecting data about customers is virtually as old as marketing itself, but the trillions of data points now available online make it a sophisticated piece of weaponry.

Marketers can map a consumer's journey across the web and potentially even augment their findings with Facebook data collected by apps that will tell people what "Hunger Games" character they're most like. Advertisers can enlist the services of a startup such as Tapad, which can follow users onto their mobile devices and tablets. Traditional data brokers sell offline data culled from public records and survey results to marketers, who then can overlay it with their purchase data and the data they've already mined online.

It's not very hard to see why this would be appealing to the CMO who doesn't think that the spray-and-pray of expensive TV advertising aimed at a massive but undefined audience is the best use of money.

"If you look at the difference between spending $300,000 to buy a 30-second spot on "Dancing with the Stars' vs. spending $300,000 on very targeted and measurable digital campaigns -- given the proliferation of data and what you know about your customers -- I think you have a much greater shot of doing the latter," said a chief marketing officer at a global company.

"Big data" is big business for governments, too. Cities such as Colorado Springs are pitching themselves as data hubs and offering tax breaks to companies that move their operations there. Colorado Springs, Colo., is home to data centers for Progressive Insurance, Hewlett-Packard and Verizon Wireless, as well as a number of military bases, the North American Aerospace Defense Command and the Cheyenne Mountain Air Force Station -- the under-the-mountain bunker built to withstand a 30-megaton nuclear blast. Walmart will soon build a 210,000-square-foot data center there and receive at least $4.5 million in tax rebates. But while everyone is doing it, it's not always clear what everyone is doing. For both competitive reasons and because of the uncertainty around the regulatory landscape, companies remain secretive about their forays into Big Data.

That regulatory landscape is neither stable nor predictable. In the U.S., a marketing industry that's long enjoyed the benefits of self-regulation has so far been given a great deal of leeway. But all it takes to change that is a few bad actors, a series of data catastrophes or a political climate in which embattled legislators see an easy target.

The White House's Consumer Privacy Bill of Rights, a privacy "framework" released in February, states that sites such as search engines, ad networks and social networks that can build detailed profiles of users by tracking surfing habits over time should be most proactive about providing "choice mechanisms" for how data is collected and used.

Released a month later, the Federal Trade Commission's final privacy framework states that data collection should be "consistent with what a consumer might expect; if it is not, they should provide prominent notice and choice."

While the U.S. industry-led self-regulatory program pivots around allowing users to "opt out" of behavioral advertising, the European Union's stance on online privacy is more severe. A "do-not-track" e-privacy directive that says consumers must explicitly "opt in" to cookie tracking has been made law in several countries. Its grace period in the U.K., Europe's biggest ad market, is winding up in May, at which point violators can be fined 500,000 pounds. But it remains to be seen how strictly the Information Commissioner will interpret the law.

Look for marketers and their legal departments to tangle more over what sorts of online data may be collected under privacy policies, especially as Facebook data become more readily available through apps.

At Procter & Gamble's Signal P&G digital event last month, a Venus brand marketer said that the company's privacy policy blocked her team from "leveraging" Facebook's open graph, and wondered if P&G had considered that consumers' clicking on Facebook Connect and other social plug-ins could be interpreted as an implicit opt-in to sharing information about themselves. (P&G's policy is that personally identifiable information must be collected on an opt-in basis.)

"First and foremost we will follow the law, and the EU standards are fairly stringent," P&G CEO Bob McDonald replied. "Hopefully, if our content is good and our ideas are good, people will opt in."

Where's the harm?
Consumers have so far voiced little complaint about ads following them around the web. Though such tracking strikes some as annoying, the argument is often made that it's a consumer benefit.

Think of getting an offer for a $100 Google AdWords credit after searching for digital-ad solutions, or a Zappos VIP membership with free shipping thrown in when you browse shoes on the e-tailer. Better an ad for something a consumer wants than something she considers irrelevant.

While there's a committed camp that thinks tracking is creepy, it's difficult to locate specific instances of alleged harm. Commenting on the FTC's draft privacy report, a representative of health-privacy watchdog Patient Privacy Rights said the group has fielded complaints from people who have been served ads for prescription drugs based on searches for medical information that they assumed were private.

Even so, according to Future of Privacy Forum director Jules Polonetsky, courts have a 10-year track record of throwing out lawsuits about cookies, online profiling and tracking where the plaintiffs have alleged emotional distress. The grounds? The alleged injury doesn't meet the bar for harm.

By most accounts, the Digital Advertising Alliance's self-regulatory program to tackle online behavioral advertising has been successful in the 18 months since it launched. Under the program, an "AdChoices" icon appears within an ad to inform users when the ad carries trackers and to offer an opt-out. Evidence of its success is the FTC's noncommittal but consistent recent statements to the effect that it won't seek legislation to address online behavioral advertising.

But even some of the most fervent online-privacy researchers said they don't see behavioral advertising as the culprit: It's overt, and savvy users might recognize a pattern when, for example, they've done web searches for "Mexico vacations" and then see an ad for a Cancun resort.

Now the DAA says it's working to implement the FTC's vision of a broader "do not collect" mechanism that will enable users to opt out of data collection by third parties -- the web of ad networks and analytics providers that are generally invisible to consumers. (The DAA issued its "Self-Regulatory Principles for Multi-Site Data" in November.) The DAA's general counsel, Stu Ingis, said the goal is for the existing icon to facilitate that opt-out by the end of this year and for browser companies to have a setting that enables the same thing.
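As a rough sketch of what honoring such an opt-out could look like on the collection side, the function below checks a browser-level "do not track" header and an industry-style opt-out cookie before deciding to set a tracking cookie. The cookie name and the overall flow are assumptions for illustration, not the DAA's actual specification.

    def should_track(headers: dict, cookies: dict) -> bool:
        # Decide whether to drop a tracking cookie for this request.
        # Honors two opt-out signals discussed above:
        #   * a browser "do not track" header (DNT: 1), and
        #   * an industry-style opt-out cookie (name is illustrative).
        if headers.get("DNT") == "1":
            return False
        if cookies.get("industry_optout") == "1":
            return False
        return True

    print(should_track({"DNT": "1"}, {}))               # False: browser opt-out
    print(should_track({}, {"industry_optout": "1"}))   # False: cookie opt-out
    print(should_track({}, {}))                         # True: no signal present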

But Stanford privacy researcher Jonathan Mayer said DAA exceptions granted for third parties that are collecting data for market research or product development give the industry too much latitude.

And however many people opt out, reams of data are still collected from those who don't, which Mr. Mayer and other privacy advocates say will inevitably lead to disaster. Malicious employees leaking customer information or hacker break-ins may be "low-likelihood" events, he said, but in the aggregate, "it starts to feel like this isn't a couple of computer-security guys who are superparanoid and thinking of worst-case scenarios. It starts feeling like 'Not if, but when.'"

Besides, online tracking isn't the only way to arrive at potentially harmful practices. In a lengthy New York Times Magazine piece about behavioral targeting published in February, the anecdote that helped send the story viral involved an angry father storming into his local Target after his teenage daughter received coupons for maternity items. The retailer's data-crunchers had embarked on a grand experiment to figure out which consumers were pregnant. That data came from shoppers' purchase histories, not from online snooping.
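As a toy illustration of that kind of purchase-history scoring -- not Target's actual model, which has never been published -- a retailer could simply weight a basket of signal products and mail coupons above some threshold. The items, weights and threshold below are invented for illustration.

    # Invented signal items and weights; purely illustrative.
    PREGNANCY_SIGNALS = {
        "unscented lotion": 0.4,
        "prenatal vitamins": 0.9,
        "large bag of cotton balls": 0.2,
        "zinc supplements": 0.3,
    }

    def pregnancy_score(purchase_history):
        # Sum the weights of any signal items found in a shopper's history.
        return sum(PREGNANCY_SIGNALS.get(item, 0.0) for item in purchase_history)

    shopper = ["unscented lotion", "prenatal vitamins", "bread"]
    print(pregnancy_score(shopper))  # 1.3 -- above a hypothetical mailing threshold of 1.0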

And the more conspiratorially minded will point out that the traditional idea of Big Brother is the bigger issue, as government agencies rifle through the data collected by marketers. In 2009, one published report described an FBI data-mining system that contained "tens of thousands of records from private corporate databases, including car-rental companies, large hotel chains and at least one national department store."

Mobile: the Wild West?
Ultimately, mobile might be the sector that drives regulators to act. As more people switch to smartphones, and those phones sync up address books and data from various realms of a consumer's life, including location, the potential for harm increases.

The mobile sector is also providing some great horror stories. New reports of apps that have collected personal data unbeknownst to their users are surfacing every few weeks. The most notorious example this year was when the well-capitalized Silicon Valley darling Path was caught dipping into users' address books. There are also abundant examples of apps' using data for seemingly questionable, if not nefarious, purposes. Take "Girls Around Me," which was aggregating Foursquare and Facebook data to inform users about eligible females in their vicinity. Foursquare shut off the app's API access earlier this month, citing a violation of its policy against aggregating check-in data across venues; Apple then removed it from the App Store. But given the growing ecosystem of apps plugging into its API, Foursquare might need to take a more proactive regulatory role among developers if it wants to avoid being dragged into the headlines. (Foursquare declined to comment.)

No legislation to address the mobile-app ecosystem has been proposed, but it's clear that the FTC is marshaling its forces to regulate the space. Its first case came in August, when it fined W3 Innovations -- which developed and distributed mobile apps such as Emily's Girl World and Emily's Dress Up -- $50,000 for violating the Children's Online Privacy Protection Rule by collecting children's email addresses without parental consent.

Some companies in the space are getting the message that they have a grace period to self-regulate before the government makes its presence felt.

The mobile-analytics platform Flurry updated its privacy policy in the fall to reject any app that provides a service for children, for example. And in the ultimate instance of how an industry leader can scramble the market if it wants to, Apple recently began phasing out apps' use of UDIDs, the unique device identifiers that have been likened to Social Security numbers for iOS devices. They can be used to track mobile behavior and have been a key ingredient in mobile-ad targeting.

Mobile-marketing companies are now trying to read the tea leaves to figure out what tracking technologies will be left alone by Apple, and being perceived as ahead of the curve might be an opportunity.

Chris Tanner, CEO of Get.It, which enables tracking of app-marketing campaigns, said his board two years ago nixed the idea of developing a tracking mechanism that would scrape together available data that's uniquely associated with a device (a practice called "fingerprinting") because it believed Apple would some day restrict those practices. Instead, Get.It decided to develop a technology based on traditional cookie tracking that could be synced from a mobile web browser to an app.

"We chose cookies because it's safe and it's understood, and even with regulation we don't think it's going to go away," he said.

And what of the notion often upheld by internet companies when regulation is mentioned: that inflexible rule-making will stifle innovation?

It's still very early days, but Union Square Ventures partner Brad Burnham thinks that limiting third parties' access to UDIDs and other data will affect the mobile-app market structure. And if regulation prohibits mobile developers from sharing data with contracted analytics firms and cloud-computing providers altogether, he thinks the effect could be more dramatic.

"We would end up with much less innovative businesses, all vertically integrated," he said.

At the same time, too much emphasis on data-usage notifications could also do damage simply by making users less passionate about mobile products, according to Morgan Reed, executive director of the Association for Competitive Technology, whose core constituents are mobile-app developers.

He mentioned Windows Vista as a cautionary tale, noting that users complained when a new version constantly asked for their permission to proceed.

"I'll tell you right now that consumers want to buy cool apps that do cool things, so that 's why you need to be really careful," he said. "Over-notification is just as big a problem as undernotification."


Do Not Track: A term interpreted differently by online-privacy hawks and ad-industry trade groups. The former want DNT regulation to stop -- or at least to curb -- the practice of collecting data that can be used to build user profiles. The latter hold that collecting data is necessary and benign, as well as critical for functions like fraud detection.

Do Not Collect: The notion that consumers have a right -- beyond the right that they have to not see targeted ads -- to not have their data collected, period. The FTC is urging the Digital Advertising Alliance to broaden its self-regulatory program (currently focused on online behavioral advertising) to let users opt out of data collection. The DAA has drafted principles that call for enabling some data-collection opt-outs, but that framework includes exceptions, such as product development and market research.

Data Brokers: Companies that buy and sell consumer information. Marketing data behemoths Acxiom and Experian qualify, as do the sites (sometimes used in background checks) that scrape public data and put it online. Privacy advocates say this cottage industry could have dangerous consequences -- revealing the location of domestic-violence victims, for example. Last month the FTC called on Congress to pass legislation that would give consumers access to the data that brokers hold about them.

Opt Out: The notion that web users should be able to withdraw an implied consent to be tracked. This view is the crux of the DAA's self-regulatory program (so far applied to behavioral advertising) and has been implicitly endorsed by the FTC.

Opt In: The notion that web users should give their explicit consent to be tracked; this is the (stricter) view endorsed by the European Union when it amended its e-privacy directive last year. Fifteen out of 27 member countries have passed related legislation.
