Need for Speed Puts Copy Testing to the Test

Brands, Vendors Search for New Ways to Validate Creative Ideas


Credit: Illustration by Stephen Webster

When Pepsi Max dreamed up an ad disguising Jeff Gordon as a normal guy test-driving a Camaro and scaring the crap out of a passenger, the Nascar star wasn't the only speed demon involved in the spot. The PepsiCo marketers behind the ad moved just as fast, including Simon Lowden, chief marketing officer for Pepsi Beverages North America, who green-lit the idea almost as soon as it was presented to him.

"This feels great. It feels right. Let's go," Mr. Lowden recently recalled saying.

The 2013 digital video drew millions of views, cementing its place as a viral hit that is still discussed today. But it's the kind of thing that might never have happened had Pepsi stopped to submit the ad to copy testing.

The episode highlights a growing trend in adland as the speed required in today's content-ravenous digital age puts a harsh spotlight on a process used for decades to validate concepts before they are put into market. Tension over testing, pitting creative agencies against risk-averse marketers, has been around as long as copy testing itself, but it's now been exacerbated by a responsive digital and social environment that requires lightning-quick action. Against a steady drumbeat for more and quicker digital content, even some of the most ROI-obsessed marketers have lost a measure of faith in traditional copy testing methods, or simply don't have time to use them, according to several marketing executives interviewed by Ad Age.

Allstate's Mayhem

Top-notch campaigns that have gone live without pretesting include Allstate's "Mayhem" and Old Spice's "The Man Your Man Could Smell Like." And as more marketers rely on gut feel rather than time-consuming consumer surveys while they pump out content on tight time frames, ads are being judged, altered and sometimes expanded after they are in market, using tools like social listening to gauge viewer engagement.

Frito-Lay North America tends to pretest TV commercials and digital ads backed by big media buys, said CMO Ram Krishnan. But more experimental digital marketing efforts—which constitute about 10% of the marketing plan—usually don't get tested. For Twitter and Facebook campaigns, "it's very tough to test just because of the volume of content we are putting out," Mr. Krishnan said.

Doing the right thing?

Copy testing is "built on really outdated thinking on how advertising works, and it's just not valid," said Tom Bick, who recently left Kraft Heinz, where he oversaw advertising for Oscar Mayer. "But it gives you the illusion that you are being a disciplined marketer and it gives you a sense of confidence, be it false, that you are doing the right thing."

Copy testing—which involves running ads by consumer panels before they are put in market—remains a trusted tool for many marketers because it works, say its defenders. "This is the method that has stood the test of time of being predictive of in-market success, and no other method has dethroned the king," said Peter Minnium, a former ad agency executive and newly named U.S. president of Ipsos Connect, a division of market research and ad-testing company Ipsos.

PepsiCo's Simon Lowden

But the king no longer rules with absolute authority. At Pepsi, copy testing is "an important tool in our tool kit," Mr. Lowden said. But it "should never be a means to red-light or green-light work. It's a way to inform and to optimize."

In the face of this skepticism, dominant ad-testing vendors Ipsos and Millward Brown are racing to remain relevant as they overhaul their survey-based methods to deliver the speed marketers crave and to account for modern viewing habits like online ad-skipping. They are also incorporating neurological and biometric techniques that judge an ad with bodily reactions, like facial expressions, meant to gauge emotions.

Surveys-plus approach

Marketers are demanding a "surveys-plus" approach that adds online behavioral data and biometric measures to consumer panel data, said Mr. Minnium, who held leadership positions at Lowe Worldwide and was the head of digital brand initiatives at the Interactive Advertising Bureau before taking the Ipsos job in July. But taking surveys out of that equation is "not something I see happening anytime soon," he said.

Indeed, the Advertising Research Foundation has found that combining traditional copy testing with neurological and biometric methods can "improve the predictive value of tests," said Horst Stipp, the organization's exec VP-research and innovation, global and ad effectiveness. But the hybrid approach "takes more time," he said. And "the market wants faster results, so the vendors are now struggling to find out how to do it faster." The debate will be the focus of a panel discussion during Advertising Week this month called "How Advertising Works: Building Brands in the Brain."

Traditional copy testing, or pretesting, has been used for decades. Modern approaches typically involve showing ads to a panel of consumers who watch them online. Surveys are used to measure attributes such as likability, persuasion and recall. Techniques sometimes blend quantitative questions—like ranking an ad's likability on a scale of one to 100—with qualitative techniques, like open-ended questions. Marketers test ads at various stages, from early versions shown in crude animated form, known as animatics, all the way through to the finished product. Often, final scores factor in norms, such as how a spot compares with historical ads or ads currently running.

Bringing home the bacon

Some critics say that while copy testing is effective at measuring informational ads, it falls short with emotional spots. "Advertising is about building trust and a feeling about a brand that predisposes people to liking you … that then allows more rational messaging maybe to come through the filter. And most copy tests don't reward you for that," said Mr. Bick, who helped contemporize Oscar Mayer's advertising by launching bacon-inspired digital videos that often went viral.

One of them, called "Say It With Bacon," included a video mocking engagement-ring ads and plugged luxurious bacon gifts that the brand actually put up for sale online. The campaign, which drew 500 million impressions, according to digital agency 360i, was not tested. For digital efforts aimed at winning PR, "we literally used what I fondly called the F-me test," Mr. Bick said. "Is it bold, will it possibly ruffle feathers internally, will consumers say, 'I can't believe they did that'?"

Marketers have used gut feel with great success in some cases, including Allstate's "Mayhem" campaign. "There was a lot of internal pressure to kill it," Lisa Cochrane, former senior VP-marketing, recalled during a presentation at an industry event in 2012. "We didn't do any market testing or focus groups," said Ms. Cochrane, who retired this year. "I just asked myself, 'Would I want to watch those ads?'"

Scott Bedbury, former worldwide advertising director at Nike, told Bloomberg in 2007 that he had an agreement with Wieden & Kennedy founder Dan Wieden "that as long as our hearts beat, we would never pretest a word of copy. It makes you dull. It makes you predictable. It makes you safe."

To that end, the original ad in Old Spice's hit campaign "The Man Your Man Could Smell Like" by W&K also wasn't copy-tested, according to James Moorhead, former brand manager of the Procter & Gamble brand, now senior VP-CMO of Dish Network.

Paint by numbers

W&K wouldn't comment for this story, but the agency has a reputation for resisting copy testing. "For them it feels very constraining when they are still in the process. You are telling an artist to paint by numbers," said Lesya Lysyj, who worked with the shop while CMO at Heineken USA and during a stint as president of Weight Watchers North America.

That position could put W&K at odds with one of its newest clients, Anheuser-Busch InBev, which has a reputation for strict copy-testing standards. The brewer—which awarded Bud Light to the agency this summer—in recent years has put so much faith in copy testing that executives were required to copy-test TV ads for priority brands in order to qualify for bonuses, according to a person familiar with the matter. A-B InBev also declined to comment for this story.

Upstart yogurt brand Chobani does not test any of its ads, said CMO Peter McGuinness, a former ad agency exec who has used many testing methods over the years.

Chobani Ad

Mr. McGuinness, who helped launch MasterCard's "Priceless" campaign while at McCann Erickson, recalled that the agency won the pitch even though the original idea for the campaign tested poorly in animatic form. Then-MasterCard marketer Lawrence Flanagan "was courageous enough" to see the potential in the campaign, Mr. McGuinness said. But testing did prove valuable because results revealed that there was not enough branding in the spot, so the marketer made tweaks like adding the brand logo at the beginning.

"I don't like using copy testing to pick creative," Mr. McGuinness said. But "I do think you can use it diagnostically." Still, he added that testing can be expensive and time-consuming and he would rather get a slightly imperfect ad into the market fast, rather than be stuck in the "inertia" of copy analysis.

Need for speed

Testing vendors are responding to the digital era's need for speed with newer, sometimes cheaper products aimed at producing almost immediate output, while blending in online behavioral metrics.

One of the newer players, Ace Metrix, was founded in 2007 on the premise that traditional copy testing was "too slow and expensive," said CEO Peter Daboll. The vendor runs ads by a 500-person panel, using nine standard questions and converting results into a single "Ace Score" that runs from zero to 950 and comprises individual elements such as likability, watchability and attention. Results can be delivered within 24 hours for a price tag that starts around $2,000 per ad.

Ace's tools include second-by-second diagnostics. So a marketer running a spot in a skippable ad unit might look closely at attention and likability scores early in the ad. To gauge the potential for an ad going viral, Ace calculates scores such as humor and "hook" (the surprise element) by using natural-language processing algorithms that analyze verbatim responses from panelists to an open-ended question asking for the "thinking and feeling" behind ratings.

WPP's Millward Brown, which has been selling a copy-testing product called Link for more than two decades, earlier this year launched a version called Link Now that uses fewer survey questions in order to deliver results in as soon as six hours. Reports show whether "you've got a surefire success here" or if the ad needs work, said Daren Poole, Millward Brown's global brand director for creative development.

Ipsos also sells a slimmed-down survey-based product called ASI: Check for $6,500 per ad, which it markets as getting "beyond the limitations of costly and time-consuming conventional research," according to a brochure.

Face time

And both vendors are moving into biometrics. Millward Brown has added "facial coding"—which uses algorithms to measure moment-to-moment facial expressions of people watching videos—to its standard ad-testing suite. Ipsos sells an optional "neuro module" with its flagship pretesting product designed for brands that want to go "beyond what people say to explore what they really feel," according to the company. Facial coding is offered as well as "implicit reaction time" methods that use response speeds to determine if an answer is "given intuitively, or if it is going through cognitive processing," according to Ipsos.

Affectiva is a Facial Coding Provider Credit: Affectiva

Ipsos also offers ASI Live, which uses programmatic methods to serve digital ads—including native ads and social content—into the devices of its consumer panelists, who encounter them just like real ads (and they aren't told which ones are test ads). They are later surveyed on the ads, and Ipsos merges the results with online behavioral data, which Mr. Minnium called the "holy grail" of ad testing. It is "hard and expensive," he said, but "technology is allowing us to do this more quickly and at a lower cost."

Likewise, Millward Brown collects online behavioral data on test ads, like if they are viewed to completion or skipped. "It's as close to real life as you can get," Mr. Poole said.

Are you listening?

Some brands are opting to go straight to real life for digital content, using tools like social listening, rather than copy testing, to determine when to throw more money behind campaigns.

Oreo: Snack Hacks Midnight Hack by Roy Choi

Lee Maicon, chief strategy officer for digital agency 360i, pointed to a recent Oreo campaign called "Snack Hacks." The idea grew from a simple Facebook post in which a fan showed an Oreo being dunked into a glass of milk with a fork. The agency noticed the post gaining traction, so it created quick videos showing things like an Oreo being put inside a pepper mill to create cookies-and-cream ice cream, inspiring similar posts from fans. After those posts performed well, the agency grew the idea into a full-fledged web series showing A-list chefs like Roy Choi transforming the cookie into foodie creations, like a crust for chicken tenders. The video series generated more than 5 million views, according to 360i.

"We didn't need to pretest it because we had already proven the concept not once but twice before we actually invested the money," Mr. Maicon said. "The cost of exposing people to content and then optimizing it in real time is just so much more efficient."

Contributing: Jack Neff
