The 2015 upfront season featured a volley of swipes by TV execs at digital video's longstanding measurement issues. That TV's own viewership has recently declined by historic proportions was glossed over, but that's neither here nor there. The fact of the matter is, the execs are not wrong.
We're an industry looking to a dozen different metrics and signals to validate our video and editorial decisions, and in some instances we're even creating our own. However, video -- video of real substance and entertainment value -- is often expensive to create, and there is still no definitive measurement for sussing out business impact: Did this drive sales? Did this improve brand lift? Was this able to shorten someone's decision funnel? Did this produce meaningful media value?
I recently moderated workshops around this very topic: How should we be measuring the impact of our content investments? I posed the question to marketing execs, agencies and other media professionals. Here are six top metrics (in order of importance), and the challenges in tracking them:
1. Mind share: Did this inspire general awareness, popularity and cultural cachet among the audience you're going after?
Measuring mind share is an inherently subjective exercise. Unlike knowing definitively how many people watched your video or engaged with your post, you're depending on the opinions of a data sample, the quality and extent of which will always be in question. In most cases, mind share is measured manually, by surveys or by tediously analyzing the tone of a sample of press and social posts.
What is arguably the most important metric for valuing the impact of a marketing investment of any kind is measured by what is unarguably the least scientific or definitive of methodologies.
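That tone-tallying exercise can be sketched in a few lines. This is a minimal illustration only -- the keyword lists, posts and scoring rule are hypothetical placeholders, and real sentiment analysis is far less crude:

```python
# Hypothetical keyword lists; a real analysis would use a trained model
# or human coders, not word matching.
POSITIVE = {"love", "great", "amazing"}
NEGATIVE = {"hate", "boring", "skip"}

def tone(post: str) -> str:
    """Crudely score one post as positive, negative or neutral."""
    words = set(post.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

# Hypothetical sample of social posts about a campaign.
sample = [
    "Love this new spot, great work",
    "Boring ad, hit skip",
    "Saw the video on my feed",
]

tally = {t: sum(1 for p in sample if tone(p) == t)
         for t in ("positive", "negative", "neutral")}
```

Even a sketch like this exposes the problem: the answer depends entirely on which posts landed in the sample and how tone was judged.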
2. Revenue: Did your content investment deliver positive impact on your P&L?
To illustrate some of the hurdles, I'll use a popular measurement firm like Datalogix as an example. Datalogix, which was recently acquired by Oracle, is a company that helps brands merge the digital and physical habits of their customers by working with platforms, publishers and retailers to cross-reference ad exposure to brick-and-mortar purchases.
This is great in theory, but video content is not delivered or experienced the same way as digital ad campaigns. These methodologies often rely on cues like loyalty-card use or exit surveys, require the participation of both the customer and the retailer, and their accuracy depends on the size of the survey sample.
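The cross-referencing itself amounts to a join between exposure logs and purchase records on a shared identifier, such as a loyalty-card number. Here is a minimal sketch under that assumption; the IDs and amounts are invented, and real-world matching is far messier:

```python
# Hypothetical loyalty-card IDs that were exposed to the video campaign.
exposed_ids = {"L001", "L002", "L003"}

# Hypothetical in-store transactions: (loyalty ID, purchase amount).
purchases = [("L002", 24.99), ("L004", 10.00), ("L003", 5.50)]

# Revenue attributable to exposed customers, and the share of
# transactions that could be matched to an exposure at all.
matched_revenue = sum(amt for cid, amt in purchases if cid in exposed_ids)
match_rate = sum(1 for cid, _ in purchases if cid in exposed_ids) / len(purchases)
```

The weak link is visible in the code: any purchase without a loyalty ID in the exposure log simply drops out of the measurement.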
3. Brand lift: Was this something that effected a positive change in important consumer decision signals like purchase intent, preference and likeability?
Like mind share, brand lift is measured using a data sample -- most often a relatively small panel of people enlisted by measurement companies like Nielsen or Vizu. The people may or may not be current customers and all have varying degrees of familiarity with your brand or product.
Unlike mind share, brand lift is a stat that can more easily be sized up against a category or a competitor, making measuring effectiveness a bit easier.
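The underlying arithmetic of a panel-based lift study is simple: compare a consumer signal, such as stated purchase intent, between an exposed panel and a control panel. A minimal sketch, with hypothetical panel counts:

```python
def brand_lift(exposed_yes: int, exposed_n: int,
               control_yes: int, control_n: int) -> float:
    """Absolute lift, in percentage points, of the exposed panel's
    positive-response rate over the control panel's."""
    return 100 * (exposed_yes / exposed_n - control_yes / control_n)

# Hypothetical survey results: 180 of 500 exposed panelists state
# purchase intent, versus 150 of 500 in the control panel.
lift_pts = brand_lift(exposed_yes=180, exposed_n=500,
                      control_yes=150, control_n=500)
```

The formula is trivial; the hard part, as noted above, is that the result is only as good as the small panel behind those four numbers.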
4. Engagement: Was this something someone liked so much they not only watched it, but they shared it, too? The most important engagement metrics identified in the workshops included retention rates, shares, comments and likes.
What makes engagement difficult to track is the sheer number of metrics we now equate with it. In the social definition, these can be retweets or likes. But the definition growing in popularity -- arguably much more important than the number of times something was mentioned -- is retention: could someone even be bothered to watch the whole way through?
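Retention can be expressed two common ways: the share of starters who finished, and the average share of the video watched. A minimal sketch, with a hypothetical watch-time log in seconds:

```python
VIDEO_LENGTH = 120  # hypothetical video length, seconds

# Hypothetical per-viewer watch times for one video.
watch_times = [120, 45, 120, 10, 118, 120]

# Share of viewers who watched the whole way through.
completion_rate = sum(1 for t in watch_times if t >= VIDEO_LENGTH) / len(watch_times)

# Average fraction of the video watched across all starters.
avg_pct_watched = (sum(min(t, VIDEO_LENGTH) for t in watch_times)
                   / (VIDEO_LENGTH * len(watch_times)))
```

Note that the two numbers tell different stories: a video can post a healthy average watch percentage while very few people actually finish it.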
5. Reach: How many people theoretically saw this? In video terms this is most popularly quantified as video views or how many times your content is "delivered."
A "view," surprisingly, has many definitions. Some video players or platforms count a view as at least one second watched; some require three seconds or more; some cull passive views and some don't. This lack of standardization is why reach can be difficult to track.
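The effect of those competing definitions is easy to demonstrate: run the same watch-time log through a one-second cutoff and a three-second cutoff and you get two different "view" counts for the same content. The log below is hypothetical:

```python
# Hypothetical per-play watch times, in seconds, for one piece of content.
watch_seconds = [0.5, 1.2, 3.0, 45.0, 2.1, 0.9]

# The same data, counted under two platform-style definitions of a "view."
views_1s = sum(1 for s in watch_seconds if s >= 1)  # one-second definition
views_3s = sum(1 for s in watch_seconds if s >= 3)  # three-second definition
```

Same audience, same content, two different reach numbers -- which is exactly why cross-platform view counts can't be compared at face value.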
6. Viewability: This one is so hot right now. Was the video played in-view and above the fold, and did people actually see it?
Similar to the many definitions of a video view, viewability also suffers from a lack of agreement, making it one of the most highly subjective metrics at the moment.
Is a video a video if it is not physically viewable on the page? To most brands and agencies, no. But is a video a video if it starts automatically, or if the audio is muted when it plays, or if only 70% of its plays were actually in view? Many people think not.
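To make the dispute concrete, here is a minimal sketch of one possible viewability check. As an assumption, it uses an industry-style threshold of at least 50% of pixels in view for at least two continuous seconds; the play log is hypothetical, and other vendors draw the line elsewhere:

```python
def is_viewable(pct_pixels_in_view: float, seconds_in_view: float) -> bool:
    """One possible rule: at least half the player's pixels on screen
    for at least two continuous seconds."""
    return pct_pixels_in_view >= 0.5 and seconds_in_view >= 2.0

# Hypothetical play log: (fraction of pixels in view, seconds in view).
plays = [(0.9, 30.0), (0.4, 30.0), (1.0, 1.5), (0.6, 10.0)]

in_view_rate = sum(1 for p in plays if is_viewable(*p)) / len(plays)
```

Shift either threshold and the in-view rate moves with it -- which is why, absent an agreed standard, two vendors can report very different viewability for the same campaign.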