How Social Media Stole Your Mind, Took Advertising With It

Despite all our enthusiasms for the millennium, I remain a stalwart child of the '80s, the period when media underwent a mass detonation; when for the first time in human history a very large number of devices coincided in a relatively short period of time; when cable TV, personal computers, video-game consoles, VHS and pagers, cellphones and the internet all came to live under single roofs.

And so I often find myself immune to, or perhaps just mildly annoyed by, declarations of technology's mounting assault on humanity, the nostalgic cries against Facebook, Foursquare, Xbox, Netflix, HDTV, laptops, tablets, smartphones and e-readers. I have them all, and I use them all, and I like them all. So what's the bother, exactly?
Putting aside, for the moment, what all that input may be doing to our brains, as well as the integrity of the commons, there is the very practical matter of media and marketing. The stakes amount to nothing less than the $151 billion advertisers spent in this country last year trying to get the attention of 308 million citizens -- all of whom appear to be getting more and more distracted by the various media that continue to proliferate under advertising's largess. It's a recursive knot: As advertisers spend more, they extend media's restless tentacles, thus distracting us to the point where marketers have to spend yet more dollars to regain our attention, only then to re-animate media's reach with all its accompanying commotions, and ... are you distracted yet?
The impression is the basic unit of attention that media have sold and advertisers have bought for more than 50 years. But in the past decade, something has happened to it: no longer confined to newspapers, magazines, TV, radio and the web, the impression has infiltrated every waking moment of our lives through social networks and devices, competing for every last scrap of our cognitive capacity. And we have none left to give, which is why the impression, and all the economies based on it, may be doomed.
According to a now well-cited study by Stanford University researchers, human cognition, despite its endless plasticity, is actually ill-suited to media multitasking. We're not good at doing too many things at once, and we're more accurate and more attentive when we choose to engage in things selectively, sparingly, even just slowly. Interestingly, the researchers set out to discover if by some chance all of our chronic media consumption might have improved our cognitive abilities.
Sadly, it has not. Clifford Nass and his colleagues at Stanford found that "heavy media multitaskers" were actually slower and less accurate in tests of processing ability, and those who were lighter media users were in fact faster and more accurate at digesting information.
I happen to be a heavy media multitasker, evidenced in part by the fact that my Twitter stream, to pick just one of my daily inputs, is actually multiplied across three different PC readers (HootSuite, TweetDeck, Twitter.com), plus the two Twitter apps on my iPhone. Don't ask me why. I can only offer that I find a strange solace in being so readily plugged into what has become a steady and necessary media drip. So I wondered, in light of the Stanford study, if my manic hunger for information and entertainment had somehow fractured my nerves into a hot mess, impeding my ability to parse the simplest messages.
Thankfully, the answer was no. But the underlying reason for our media bumbling is actually much more troubling. "There are a few things going on here," Mr. Nass explained. "Some new results we have suggest high media-multitaskers really are seeking simultaneous stimulation -- they'd rather look at new stuff than think about old stuff."
A thousand years ago, at a time when literacy was limited to an elite few, those who, like me, were inclined toward simultaneous data would have had an advantage, according to Mr. Nass. We were keyed into our immediate environments, and our ability to quickly scan and assess the horizon allowed us to alert our tribes to any dangers afoot. But in a media landscape where the environment is a loose and largely disparate collection of abstract data, multitaskers are at a disadvantage.
"Human brains are built to integrate, to expect that the things we see are related to each other," Mr. Nass said. "In a world, however, where you see things that have nothing to do with each other -- namely the media world -- when you look by not focusing, you're missing important things."
While that scenario may appear to be a nightmare for creators of content and advertising, there are certain advantages to this social duality, according to the research. Both those who scan and those who focus on media are equally susceptible to a canned message; it just depends on the form. Scanners like me are more likely to notice, say, advertising on a web page where it's simultaneously competing against the content. As Mr. Nass described, we're unlikely to focus on a single thing for too long, so we anxiously look about the page, taking note of every element. A more traditional and linear ad narrative, such as the ones molded by TV and radio commercials decades ago, appeals to people who prefer to zero in on a single scene at a time.
Looking at it this way, digital advertising was unwittingly crafted for the distracted among us. Some of us seek that frenzied mix of image and text and sound with so many links. Take that with a grain of salt. Because for all my datum lust, I am not an explicit fan of digital ads; most likely, no one is. Consider this recent tweet from Foursquare founder and CEO Dennis Crowley: "Dear advertiser who dropped their ad into the middle of the live stream I was watching, I HATE YOU AND WILL NEVER BUY YOUR PRODUCT."
That cry from the commons has exhorted digital publishers to try to make both content and advertising less distracting.
Tim Armstrong, a former sales executive at Google and the current CEO of AOL, has during the past year and a half regularly delivered his homily on the need for a new, beautiful web, free of clutter. His company has, during the past year, slowly trimmed its sites of the befuddling ads that typically riddle web pages. As a result, AOL has been hemorrhaging ad dollars, its revenue dropping 25% from the previous year to $2.41 billion, a decline of some $850 million.

Mr. Armstrong, however, is on a mission. As he told me, "There are just way too many impressions on the web. We're essentially trying to change ... the whole look of the web. It's become a distraction that doesn't benefit anyone." More often than not, the word he preaches is "re-architect," suggesting he's out to change not just how websites look, but how they work. "We believe ads are content," he said. "When ads are good and they work well, they can be as meaningful as the content they're sitting next to."
That's an interesting proposition given that AOL recently acquired a blog-driven content machine, The Huffington Post, for $315 million. Curiously, Editor in Chief Arianna Huffington has been put in charge of all content for the combined entity, suggesting that Mr. Armstrong, while maintaining his position as CEO, will be in the more workaday business of building advertising. A team of sales people, engineers and designers at AOL recently developed a new ad format that, depending on how you look at it, appears to be a site within a site. It's large; it takes up almost half the page; and it contains modules that allow people to do things such as play movies and read product information, even scroll through a full retail catalog.
But will people actually want it? Will they even notice? Before we both say it's far-fetched, consider just how many of the 111 million people who watched the Super Bowl this year cared about the game. It's a spectacle, a sideshow where advertising for once becomes the main event. But then again, that lasts just a few hours out of an entire year, or about 0.04% of the time.
In 2006, Interpublic Group of Cos. set up a media research lab (an increasingly common fixture in the ad world), and one of its recent studies found that consumers are alighting on technology more quickly and more widely, specifically to avoid getting distracted.
According to the lab, 56 million Americans over the age of 18, or about a quarter of the adult population, have regularly avoided watching live TV and have instead opted to watch programming via digital video recorders, the internet, smartphones, or media devices that haul in online video to their big-screen TVs. Of course, the prospect of skipping ads is compelling, but there's a deeper, more innate reason.
Brian Monahan, exec VP at the IPG Media Lab and a former media buyer with Universal McCann, said of the results: "My sense of it is, consumers are wading through all of this stuff, and they're embracing the technology to feed their brains as efficiently as possible. They want to watch what they want, how they want it, when they want it. And guess what: Advertising is low on that list." He went on to offer a gut summation: "I feel like people are looking for more authenticity with their media; they're consuming it ... with more intent; they want it to better represent who they are."
The idea that media should be a mirror and not a guiding authority sits at the heart of the escalating tension between traditional producers and digital inventors. The internet has voiced the desires, the so-called intent of regular folks, and at the heart of that intent lies another company: Google.

Google's real killer application is its database, which mimics the corporeal world as closely as possible. Every entry into Google reveals our "intent," and the more we reveal, the more accurately Google can spit back what we want. And if a man named Amit Singhal has his way, Google will eventually give us what we want before we even ask for it -- maybe even before we know we want it.
Mr. Singhal is a 42-year-old native of India and is in charge of Google's core ranking system, making him the one person on the planet who determines on a day-to-day basis which link should sit at the top of a search result. Everyone in the content business, from Demand Media to the humble site editor in charge of "optimizing" for search, would probably kill to pick his brain -- and they'd be surprised at what they'd find.
Mr. Singhal is avuncular and genial, and his dream comes across as equally friendly. He pointed out that information overload is not an issue unique to our multimedia age. The card catalog, he said, was an early solution to what many thought was the overwhelming problem of a deluge of books being published year after year. In many ways, Google is just the latest system to respond to that dilemma -- a perpetually expanding universe of data. Toward the end of our phone conversation, he added, "We're going to get so good with search, we're almost going to guess your query even before you type it."
Just a few years ago, Nielsen Co. began measuring simultaneous media consumption -- specifically, how many people were watching TV while also surfing the internet -- and the results may surprise the wary Luddites among us. In March 2009, 61.5% of Americans were watching TV while also trolling the internet. Exactly a year later, in March 2010, 58.7% of people in this country found the need to multitask the two screens. That drop is all the more surprising considering that we're watching more TV than ever, averaging about 35 hours per week in early 2010, vs. 33 a week in 2009. Though Nielsen didn't offer a succinct reason for the decline in two-screen viewing, looking at what people are actually doing while they're clicking through the internet and the remote at the same time may offer a clue.
"It's not disassociated usage," said Pat McDonough, a senior VP at Nielsen. "It's related viewing." Meaning that when people, say, are watching the New York Jets defeat the New England Patriots on TV, they're also clicking through a sports site like ESPN.com to read, rewatch and comment on the highlights.
As a lifelong Yankees fan who attended dozens of games at the old Yankee Stadium, I've often marveled at what has always seemed to be an unusual spectator tradition, and which I've only now come to understand, courtesy of this dual-screen insight. I call them the superfans. They're typically middle-aged men who sit in the bleachers, earphones stuffed into their heads to get the radio announcer's play-by-play despite sitting just a few hundred feet from home plate; at the same time, a portable TV sags in their laps to provide them with replays. Yes, a little crazy, but these guys are actually ahead of their time, and they illustrate Nielsen's multi-screen data better than anyone. We need media not just to know the game better, but to live it again, to define what's happening in front of our eyes. We're not blindly plugged in -- we're patently plugged in. We need more media, not less.
The recent Super Bowl was the most-watched in history, with 30-second ad spots costing $3 million apiece. Amid a crowd of ads explicitly designed to stand out, Chrysler decided it needed to stand out even more. In our multitasked, ever-distracted age, the Detroit-based company paid $9 million to run a full two-minute commercial, an eternity in ad time and unheard of in a Super Bowl. It was noticed for any number of reasons, but whatever the draw happened to be, it contradicted, however incidentally, the proposition that media moments today need to be slimmed ever further to stand out.

"That's what's become really tricky," said Aaron Allen, the creative director at Wieden & Kennedy, which was the agency behind the Chrysler ad. "It's especially tough with social media and interactive and nontraditional, it feels like there are littler messages in more places for people to see."
He pointed out that the long-form ad narrative, which used to be more of a stunt, was less integral to this campaign than the depth of its message, which in this case happened to be a sentimental paean to Detroit's gritty history. "The gimmick will never work," he said. "At the end of the day, if there's no substance behind it, it's going to feel hollow, and no one will care."
But for the 111 million people who witnessed the spot, the message's force was in many ways dependent on its length. Does that mean longer is better? In some cases the answer is "yes," and the evidence comes from an unlikely forum: Twitter.
In April 2009, Mark Armstrong, who was at the time doing content strategy for an internet startup called Bundle, started to aggregate lengthy articles from outlets such as The New Yorker, Vanity Fair and The Awl, but instead of doing it on a blog, he did it through a hashtag on Twitter: #longreads. "The irony of that hasn't been lost on me," he said of the fact that Twitter limits messages to 140 characters. He has since expanded Longreads to reside on a website (Longreads.com), but it's still driven by the plenum of Twitter followers who "shout out" suggestions using the original hashtag.
"It's a perfect antidote to the current problems with there being so much noise and junk on the open web right now," he said. "The combination of high-quality content that is curated by people you trust, that's a really key combination to this."
Mr. Armstrong pointed out that the emergence of devices such as the iPad and iPhone has actually focused people's media attention rather than distracting it. He cited a study done by another long-form aggregator, Read It Later, headed by his friend, Nate Weiner. "He found that the majority of people were pressing 'Read It Later' all over the place during the workday, where they're essentially in hunter-gatherer mode," he said. "And then there's this spike in traffic with people coming back at 8 p.m. and the weekends -- these are the times when they're more focused."
I happen to be a follower of @Longreads, and despite my qualification as a high media-multitasker, I relish all of its suggestions, which have focused my attention daily and in many ways provide that media mirror Mr. Monahan described.
My head, by the way, feels fine. Remember, I come from the '80s, when another movie, "Blade Runner," presaged a deeper possibility: that humanity is becoming indistinguishable from the artificial codes that operate the bio-engineered "replicants" running through that film's postmodern fairy tale. There may be some truth to the feeling that we humans are becoming more "code-driven," more superpowered, but is that so bad? If it means we can finally take charge of our own magazines, our own websites, our own TV stations, our own books, our own movie theaters, our own ... are you still distracted? Didn't think so.