Facial Recognition Lets Marketers Gauge Consumers' Real Responses to Ads

Watch What They Do, Not What They Say With New Tools


Credit: Courtesy Affectiva

The future of ad testing is not about surveys, but about smiles and smirks.

If you want to know whether a spot really sells, after all, shouldn't you measure what people feel, not what they say? That's the argument being delivered across the country by a new crop of startups brandishing sophisticated facial-recognition technology -- and quickly reshaping how marketers decide which ads work.

One of the most successful purveyors of so-called facial coding is Affectiva, whose technology has been used by a range of big-spending marketers, including Kellogg Co., Mars Inc. and Unilever. The company's flagship product, Affdex, uses algorithms to measure and analyze the moment-to-moment facial expressions of people watching videos. Participants opt in to be recorded while watching a video on a smartphone or laptop, and the program sends the resulting data back to market researchers. The system allows marketers to test ads for emotional impact on a large scale: To date, Affectiva has analyzed some 2.8 million faces across 75 countries.

"Asking someone 'How does it make you feel?' is just not as good as being able to observe in reality how they are feeling," said Affectiva CEO Nicholas Langeveld, who joined the firm in 2011 after holding executive roles at companies including Nielsen and IBM. "People have a really hard time articulating their feelings. And sometimes there is a subtle fleeting little emotion that people aren't even aware is happening."

Typically Affectiva works directly with market-research companies that incorporate the findings into their work on behalf of clients. WPP's Millward Brown began using the technology in 2011 and started including it as part of its standard ad-testing suite in January 2014. WPP was so impressed that the holding company has invested $4.5 million in Affectiva.

The technology allows clients to slice and dice data and find trends across demographics and geographic regions. And because emotions usually don't lie, facial expressions sometimes reveal responses missed by traditional question-and-answer surveys, while uncovering subtle differences between cultures.

For instance, Affdex was recently used to test a European campaign that a marketer also wanted to run in the Middle East. The spot included a scene in which a man touched a woman's arm. While survey responses reflected societal norms in which such touching would be frowned upon, facial coding revealed that "it fit within the content of the ad and it worked well," Mr. Langeveld said.

Still, it might be too early to declare the death of the traditional focus group. Affectiva proponents stress that the technology should be used to complement traditional copy-testing methods, not replace them. Facial coding is not used "in isolation," said Graham Page, executive VP and head of global research solutions at Millward Brown.

But it has become so influential that it is starting to direct big ad budgets. In the U.K., Kellogg Co. used Affectiva's technology to analyze three versions of an ad for its Crunchy Nut cereal. While one spot garnered a lot of smiles, it also produced more negative responses than another ad that was more universally liked. Kellogg put more media weight behind the better performer, according to a case study shared with Ad Age by Affectiva.

Affectiva was co-founded by Rana el Kaliouby, a former Massachusetts Institute of Technology researcher who originally created the technology as a way to help autistic children better communicate. Affectiva spun out of MIT in 2009 and is based in Waltham, Mass.

Competition is coming from the other side of the country via Emotient, which has its roots in the Machine Perception Lab at the University of California, San Diego.

The company, founded in 2012, does ad testing with controlled groups but is also putting a heavy emphasis on analyzing ads "in the wild." For instance, Emotient's technology might be used to analyze the facial expressions of people viewing ads on digital signs in real life.


To avoid privacy concerns, the company does "anonymized aggregate analysis" and typically studies public venues where people are already being filmed, such as by a security camera, according to executives. "We keep the metadata," said CEO Ken Denman. "We discard the image."

Emotient recently worked with an undisclosed NBA team to analyze how faces in the crowd reacted to in-arena ads, including videos played on giant scoreboards. Theoretically, the data could be used to determine when during a game certain content gets the most attention and best reaction.

By analyzing faces in the crowd, Emotient also revealed a surprisingly basic fact: More women were attending games than the team had realized. The team "had no idea," Mr. Denman said. That finding could result in more ads from brands targeting women, he said.
