Using 'Fuzzy Analytics' to Accurately Measure TV Ad Performance
When it comes to measurement, most TV advertisers know audience reach, some demographics and probably some level of top-line results. Admittedly, it doesn't match the depth and granularity of data we can get for online campaigns, where we know who's responding to our ads, what they're doing on our websites, how much time they spend there and whether or not they complete a purchase.
Even without that level of detail, most of us truly believe that TV works. But too many TV advertisers concede defeat in the measurement game -- unnecessarily. Sure, when compared with the detailed tracking available for online ads, TV can be a little fuzzy. But that doesn't mean it's useless; far from it.
TV (and offline tracking in general) can provide far more insight than most people think is possible. An understanding of "fuzzy logic" can help us make TV ad measurement nearly as precise as online measurement.
We face the realities of fuzzy logic every day. Look around the office -- is it hot or cold? Phil the receptionist is wearing a sweater. I guess it's cold. But his desk is in front of the main door, and every time someone opens it a gust of cold air from outside slaps him across the face.
Diane, who's got an office on the west side of the building, is wearing short sleeves and is continually wiping sweat from her brow. It must be hot. Of course, it's almost the end of the month and she hasn't met her sales quota. Plus, the sun's shooting darts through her window.
Sometimes the data we gather with our senses doesn't begin to reveal the true complexity of a situation. We have to supplement what we see or hear with context. This is why we have fuzzy logic.
To gather data, and make decisions based on that data, requires a system that can take into account imperfect, incomplete and even subjective information, then provide context that will help us understand the situation.
To deal with the relative ambiguity of TV ad measurement, my agency adapts the concept of fuzzy logic into what we call "fuzzy analytics." Here's how it works: Find a level of tracking we can do, accept its imperfections, gather data, analyze it and improve our ability to understand it as we go. It evolves into a system that is nearly as accurate as following a click online.
Here's an example: Hotwire.com. We've been working with the online discount travel site for about four years, and we've focused primarily on TV.
As is the case with many dot-coms, the name -- Hotwire.com -- is both the brand and the destination to which we're driving consumers. When designing the campaign, building brand was just as important as driving response, so we didn't want to send consumers to a custom URL that would detract from the company name. Since we're not driving to a unique URL or phone number, return on investment is more difficult to parse.
The starting point is to determine the baseline: What would we expect the business to do today if no advertising were running? Hotwire started with a measurement model built on assumptions, then applied new information from day to day, honing the model over time.
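The baseline idea can be sketched in a few lines. This is a minimal, hypothetical illustration -- the visit counts, the day-of-week grouping, and the names are all invented, not Hotwire's actual model, which the article doesn't disclose:

```python
# Hypothetical sketch of a baseline model: estimate what site traffic
# would look like with no advertising, then credit the excess to TV.
# All numbers here are invented for illustration.

# Daily site visits from a historical period with no TV spend,
# keyed by day of week (0 = Monday) to capture weekly seasonality.
no_ad_history = {
    0: [9800, 10100, 9950],
    1: [9400, 9600, 9500],
    2: [9700, 9900, 9800],
}

def baseline_for(day_of_week):
    """Expected visits with no advertising: average of comparable days."""
    samples = no_ad_history[day_of_week]
    return sum(samples) / len(samples)

def incremental_visits(observed, day_of_week):
    """Visits above baseline, credited (imperfectly) to the campaign."""
    return observed - baseline_for(day_of_week)

# A Monday with TV spots on air: 12,400 observed visits
# against a baseline of 9,950 suggests roughly 2,450 incremental visits.
print(round(incremental_visits(12400, 0)))
```

The "fuzzy" part is that the attribution is imperfect -- the model starts from assumptions and gets honed as each day's data either confirms or contradicts it.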
Today we have a good sense of how our advertising is performing. We know where and when we're getting results, and we can determine the cost of each incremental customer. We're about 90% confident in the decisions we make about how to allocate our budget across networks, shows, days, dayparts, and even our creative mix (including creative length and specific creative executions).
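To make the allocation decision concrete, here's a hypothetical sketch of ranking dayparts by cost per incremental customer. The spend figures, customer counts, and daypart names are invented for illustration; the real model would draw on the baseline comparisons described above:

```python
# Hypothetical: rank dayparts by cost per incremental customer,
# to decide where the next budget dollar should go.
dayparts = {
    # daypart: (ad spend in dollars, incremental customers vs. baseline)
    "early morning": (20000, 450),
    "daytime":       (35000, 600),
    "prime":         (90000, 1200),
    "late night":    (15000, 420),
}

def cost_per_customer(spend, customers):
    return spend / customers

# Cheapest incremental customers first.
ranked = sorted(dayparts.items(), key=lambda kv: cost_per_customer(*kv[1]))

for name, (spend, customers) in ranked:
    print(f"{name}: ${cost_per_customer(spend, customers):.2f}")
```

The same ranking logic extends to networks, shows, days, or creative executions -- anything you can tie a spend figure and an incremental-result estimate to.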
Here are some of the guidelines we follow:
- Know the company intimately. I can't stress enough the importance of high-level access to real information from the client. Get inside access to key company-wide performance indicators. If no advertising were running, how many calls, website visits, or in-person visits would the company receive from prospective customers? From what geographic areas? How many would result in sales? Identify the patterns and trends that already exist.
- Do the work. Gather the data from your campaigns and compare it to your baseline. Analyze it and make incremental changes as you go. Make note of any peculiarities of that time period, and make allowances for those. It will evolve into a highly efficient analytics machine.
- Compare what your performance model is telling you with big company-wide metrics like profitability. This may seem obvious, but many professionals get lost in their own analytics and forget the big picture. A spike in sales may lead you to believe the campaign is successful. But if the customer service department is overwhelmed with complaints and overall profitability is down, then perhaps the sales number isn't giving you the full picture. Investigate the anomalies.
- Never burden your potential customers with your tracking. We fall into this trap when we create custom URLs simply to track the campaigns (yourcompany.com/offer6, for example). It's likely to reduce your overall response rate. It also dilutes the brand message and puts the burden on consumers to remember a longer URL simply so the advertiser can measure results.
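The third guideline -- checking your performance model against company-wide metrics -- can be sketched as a simple sanity check. The thresholds and figures below are invented for illustration, not a prescription:

```python
# Hypothetical sanity check: before declaring a sales spike a win,
# compare it against company-wide metrics. All thresholds and
# figures are invented for illustration.

def flag_anomalies(period):
    """Return reasons a sales spike might not be a genuine win."""
    reasons = []
    if period["sales"] > period["baseline_sales"] * 1.10:  # >10% spike
        if period["profit"] < period["baseline_profit"]:
            reasons.append("sales up but overall profitability down")
        if period["complaints"] > period["baseline_complaints"] * 2:
            reasons.append("customer-service complaints spiking")
    return reasons

campaign_week = {
    "sales": 130000, "baseline_sales": 100000,
    "profit": 8000,  "baseline_profit": 12000,
    "complaints": 340, "baseline_complaints": 90,
}

for reason in flag_anomalies(campaign_week):
    print("investigate:", reason)
```

The point isn't the specific thresholds -- it's that the campaign model never gets to grade itself. It answers to the big numbers.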
For every client, the metrics and performance indicators may be different. But here's one clue that the approach works: The Hotwire.com campaign is ROI-positive -- meaning it achieves a positive return on every ad dollar spent. And we only know that because we measure.
Lest you think such a tight focus on measurement only comes at the expense of the brand, look at it this way: An ROI-positive campaign is rapidly scalable, giving us the ability to increase frequency. That's a solid way to build brand awareness. And the proof is in the numbers.