Why Google's 'Time Spent' Metric May Not Be the Best Measure

Marketers Need an Accurate Way to Measure How Their Content Is Performing


In the advertising world, Google is the default for almost everything: search, display, video, ad serving, analytics. The products we know as DoubleClick, YouTube, Google Search and Google Analytics are ubiquitous for a reason -- they're damn good. But over time, damn good can turn into average and eventually degrade into bad. For publishers and marketers relying on Google Analytics' "time on page" metric, we've moved past bad toward just plain wrong.

Why does this matter? How could a single flawed metric be cause for alarm? There are two reasons why this is significant to every marketer and publisher right now.

First, there's the absolute dominance of Google Analytics on the internet. According to BuiltWith, over 70% of .com sites in the U.S. currently use Google Analytics. Google's closest competitor is Facebook Domain Insights at a paltry 3% market share.

Second, as marketing continues its shift from banner ads to content marketing, the metrics we use to quantify success are also shifting, from clicks to attention-based metrics. We trust that our content is performing well because Google is telling us that people are spending time with it.

As co-founder of Pressboard, a content marketplace, this topic hits close to home. Every day we report to brand managers and agencies on how well their sponsored stories are performing across a wide range of online publications. Our reporting data was initially built on top of the Google Analytics API, until we started to see some disturbing inconsistencies. A story about financial planning would show 17 minutes of time spent on page, while a very similar story, on a very similar site, would barely crack the 1-minute mark. To better understand these anomalies, we dug through millions of rows of data across hundreds of stories. We soon found out that everything we thought we knew about "time on page" was a myth.

Here's how it really works. Google Analytics is installed on this site, so when you landed on this article, Google marked down the time; let's call that time A. When you're finished reading this article you may head over to another Ad Age article. Google will mark down that time as well; let's call that time B. By subtracting time A from time B, Google knows how long you were on this page. This timestamping method is clearly outlined in its support pages. Sounds pretty simple, right?
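Here's a minimal sketch of that timestamp-difference calculation in TypeScript. The Pageview shape, the timeOnPage helper and the sample URLs are invented for illustration; this is not Google's actual data model or code.

```typescript
// Hypothetical pageview records -- not Google's internal format.
interface Pageview {
  url: string;
  timestamp: number; // epoch milliseconds when the page was opened
}

// Naive "time on page": the gap between one pageview and the next one
// in the same session. The last pageview has no successor, so its time
// on page simply cannot be computed this way.
function timeOnPage(current: Pageview, next?: Pageview): number | null {
  if (!next) return null;                    // no exit timestamp, no measurement
  return next.timestamp - current.timestamp; // time B minus time A
}

// Example: this article opened at time A, the next article at time B.
const timeA: Pageview = { url: "/article/time-spent", timestamp: Date.parse("2016-05-10T09:00:00Z") };
const timeB: Pageview = { url: "/another-article", timestamp: Date.parse("2016-05-10T09:03:00Z") };

console.log(timeOnPage(timeA, timeB)); // 180000 ms = 3 minutes
console.log(timeOnPage(timeB));        // null -- the session ends here
```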

Here's where it breaks down. Let's say midway through this article you jump over to another tab in your browser, or decide to finish up that client presentation that's due in an hour, or grab yourself a third cup of coffee. Google has no idea that you've physically taken a break from reading, so the clock keeps running. While this article should only take you a few minutes to read, you may be counted as having been on the page for 20 minutes or more. I'd like to believe that this article is interesting, but it's not that interesting.
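To put numbers on that, here's a short illustration with invented figures: two minutes of actual reading followed by an eighteen-minute detour still comes back as twenty minutes, because the timestamp difference has no way to subtract the idle stretch.

```typescript
// Invented figures: the reader reads for 2 minutes, then leaves the tab
// idle for 18 minutes before clicking through to another page.
const openedAt = Date.parse("2016-05-10T09:00:00Z");  // time A
const clickedAt = Date.parse("2016-05-10T09:20:00Z"); // time B
const actualReadingMs = 2 * 60 * 1000;

const reportedMs = clickedAt - openedAt;   // 1,200,000 ms = 20 minutes
console.log(reportedMs / actualReadingMs); // 10 -- a tenfold overstatement
```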

Then there's the larger issue of ever-increasing bounce rates. "Bounces" are visitors who only view one page on a particular site before "bouncing" to another destination. So if you arrived at this article from Twitter, read it and then decided to head anywhere other than another Ad Age article, you've "bounced." Google Analytics cannot measure any time on page for bounces. With bounce rates on media sites having gone from 20% of visitors in the early 2000s to well over 70% of visitors today, that means that Google has no idea what the vast majority of readers are up to, ever. For the small number of visitors that it can see, the time B minus time A method I outlined above is wildly inaccurate.
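Here's a hedged sketch of why bounces vanish from the metric entirely: with only one pageview in the visit, there is no second timestamp to subtract. The Visit type and helper names are invented; the 70% figure is the bounce rate cited above.

```typescript
// Minimal stand-ins, not Google's data model.
interface Pageview { url: string; timestamp: number }
type Visit = Pageview[]; // one visit = the ordered pageviews in a session

// A bounce is a visit with a single pageview: there is no next pageview
// to supply time B, so no time on page can be computed for it at all.
function isBounce(visit: Visit): boolean {
  return visit.length < 2;
}

// The share of visits that contribute any "time on page" measurement.
function measurableShare(visits: Visit[]): number {
  return visits.filter((v) => !isBounce(v)).length / visits.length;
}

// With bounce rates above 70%, measurableShare falls below 0.3: the
// reported average rests on a shrinking minority of visits, and even
// those are measured with the flawed timestamp difference.
```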

Fortunately, there are a handful of companies trying to solve this problem. The secret is being able to continuously monitor the reader's attention signals -- mouse movements, touches, scrolling behavior, tab activity. The more you know about the reader between time A and time B, the better positioned you are to know if they are still active -- or if you've lost them for good.
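Here's a minimal sketch of what attention-based measurement can look like in the browser, using standard DOM events. The idle threshold, tick interval and choice of signals are illustrative assumptions, not any particular vendor's implementation.

```typescript
// Count a reader as engaged only while attention signals keep arriving.
const IDLE_THRESHOLD_MS = 15_000; // stop counting after 15s of silence (illustrative)
const TICK_MS = 1_000;

let activeMs = 0;
let lastSignal = Date.now();

function recordSignal(): void {
  lastSignal = Date.now();
}

// Attention signals: the reader is interacting with the page.
["mousemove", "scroll", "touchstart", "keydown"].forEach((event) =>
  window.addEventListener(event, recordSignal, { passive: true })
);

// Tab switches are explicit: the clock stops while the page is hidden.
document.addEventListener("visibilitychange", recordSignal);

// Accumulate time in small ticks, but only while the tab is visible and
// a signal has been seen recently.
setInterval(() => {
  const recentlyActive = Date.now() - lastSignal < IDLE_THRESHOLD_MS;
  if (!document.hidden && recentlyActive) {
    activeMs += TICK_MS;
  }
}, TICK_MS);

// activeMs can then be reported to an analytics endpoint as engaged time,
// instead of inferring attention from a pair of pageview timestamps.
```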

Chartbeat has been a forerunner in providing editorial analytics for media publishers. Sites such as Gawker, Forbes and Time count on Chartbeat analytics to tell them which of their stories are most popular among their readers. Medium has long focused on total time reading as its metric that matters. My company rebuilt its entire analytics engine and weaned itself off the Google Analytics API in order to properly measure native content partnerships. If content marketing is going to continue its growth, the way in which it is measured must grow as well.

Moving beyond Google isn't an easy decision, but I believe we've given them enough of our time. Then again, it seems doubtful they were able to measure it anyway.