I read a great book the other day. It was about an alcoholic writer who slowly goes crazy inside this creepy haunted hotel in Colorado, then tries to murder his family. I'm not going to tell you the book's name. But it was turned into a decent movie starring Jack Nicholson. And I'm not going to tell you the author's name. But I can tell you that he's tall, lives in Maine, has written books about vampires, rabid dogs, toddler resurrection, possessed trucks, young boys finding a body, a guy escaping prison and even an entire one about a lady tied to a bed after her husband dies while having sex. The author is a former alcoholic and drug addict himself and was nearly killed once after being hit by a van.
But again, I'm not telling you his name. Because I want to respect his privacy.
Obviously, anyone with even a passing knowledge of American pop culture -- or access to Google -- can figure out exactly who I'm talking about. (And if you can't, I don't want you reading my precious, precious words. Seriously, go read something penned by a "thought-leader.")
This may seem like a tortured way to introduce the topic of data anonymization, but it seemed a lot more polite than leading with this: Data anonymization is a load of horseshit. Data anonymization is also a clever bit of technical and verbal misdirection used by marketers and tech people to keep regulators at bay.
What data anonymization decidedly isn't, by any meaningful definition that has to do with reality, is making consumers and/or their data truly anonymous to marketers and tech people.
As you're well aware, we live in the era of Big Data and its very promise is that marketers can find and target consumers with a precision never before imagined. That's great news for marketers! Also good news is that consumers don't really think about it all that much.
The bad news is that when consumers do think about it, it seems really creepy. (And then they go back to dumping information into Facebook and hitting "OK" when a mobile app asks if it can use location tracking.)
This makes marketers and tech people sad. What they'd like to say to consumers is this: "This isn't any creepier than me digging through your trash and going through all your receipts, prior purchases and take-out bags in order to figure out the perfect present to buy you for Christmas. Besides, you put that trash in those cans voluntarily."
Tech people might actually say that. Marketing people, on the other hand, typically have communications professionals on staff who tell them to pull a concerned, reassuring face and promise to always anonymize consumer data. They hash it. They wave a magic wand over it. They put it under a bunch of cups and move it all around.
The thing is, though, that they can move it back around later and match a little of this and a little of that and bump this database up against that database and serve you some very specific ads.
But it's still OK because they strip it of personally identifiable information. Which, in most cases, seems to mean they just take your name or address or email off of it.
You know, sort of like I did with Stephen King and "The Shining" above. Or, in the dumpster-diving example, it's the equivalent of pretending to forget your address.
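For the technically curious, here's a minimal sketch (in Python, with invented sample data) of why hashing an identifier is more shell game than anonymization: anyone holding an ordinary list of candidate email addresses can hash each one and match the "anonymized" records right back to a person.

```python
import hashlib

def anonymize(email: str) -> str:
    # The common "anonymization" move: replace the identifier with its hash.
    return hashlib.sha256(email.lower().encode()).hexdigest()

# A marketer's "anonymized" database (sample data, invented for illustration).
anonymized_records = {
    anonymize("jane.doe@example.com"): {"segment": "very specific medical condition"},
}

# Anyone with a plain list of email addresses (cheap to buy or scrape)
# can hash each candidate and look it up in the "anonymous" records.
candidates = ["john.smith@example.com", "jane.doe@example.com"]
for email in candidates:
    record = anonymized_records.get(anonymize(email))
    if record:
        print(email, "->", record)  # re-identified
```

The hash hides nothing from anyone who can guess the input, which for email addresses is everyone.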
When my colleague Judann Pollack recently received a very specific telemarketer phone call about a very specific drug for a very specific medical condition, we put data reporter Kate Kaye on the case. What Kate was told by various industry people boiled down to this: 1) It didn't happen. 2) If it did happen, it was the consumer's fault. 3) And it was perfectly fine because we didn't use her name, we used her household.
If those sound like horrible arguments, they are. And the only good argument isn't much better.
We've actually said in these pages before that the best argument marketers can make regarding all of this digital creepery is to explain to consumers that this is how they get relevant advertising, rather than having ads for, say, leg-waxing or kitten sweaters delivered to them.
The one small problem with that argument is that it's basically an admission that there's not a whole lot of actual anonymization going on. You're giving me relevant ads because you know who I am. You might not know my name (but you probably do), but that hardly matters if you know every move I make, every breath I take.
Let's take this example to an unsavory extreme. What if I'm a single guy with an interesting set of health problems? I've been to the doctor and maybe used WebMD. Does it matter if marketers don't know my name if, when my girlfriend uses my computer to Google a hot new restaurant, all of the ads being served are for herpes medications and ointments for super-persistent hemorrhoids? Or say it's a few years down the road, the dream of addressable TV has come true, and I'm over at a lady friend's house while half the ads on her TV are for yeast-infection and colitis treatments?
Granted, due to technology issues and bad offline data, marketers can't do a lot of this seamlessly. Yet.
And they do have the law -- or lack of laws -- on their side. And they'd like to keep it that way. Every time you bring up the possibility of bad actors, marketers start screaming about self-regulation and noting that it's much more important to pass a data-breach bill. That might sound like the sort of "Look over there!" argument a cartoon character would make before scampering off into the woods.
But you can see why a data-breach bill is so important to marketers. Not because they care about consumer data, but so they can get their lobbyists to craft a federal data-breach law that a) limits their liability when their supposedly secure servers get hacked and b) preempts state laws from those commies in California or wherever that might be less forgiving toward corporations.
And while everyone else is all worked up over data-breach regulation, things like an actual data-broker law quietly slip into anonymity.
Ken Wheaton, the managing editor of Advertising Age, writes our Last Word column. His latest novel, "Sweet as Cane, Salty as Tears," was published in 2014.