Since the announcement, and now the release, of the iPhone X, facial recognition has quickly become a topic of household conversation. As it is wont to do, Apple is taking a technology that previously existed in niche or academic domains and introducing it to popular culture (amid great expectation). And while the key feature is a security function -- unlocking your phone -- a mass consumer device that can recognize faces is clearly a powerful technology with potentially vast implications.
By some estimates, 90% of personal communication is nonverbal. Regardless of the exact figure, it's clear we lose a great deal of signal in digital communication because we can't read the nonverbal cues of the person on the other side of the screen. We've been using emojis and LOLs to make up for our inability to share emotion in our digital interactions.
For anyone in the communication business, genuinely understanding the customer is the difference between a good experience that leads to trust and loyalty, and a bad experience that leads to long-term brand rejection. In the future, good brands could be even better if marketers can obtain real-time measurements of how a given experience is affecting a customer, and react dynamically.
It's easy to paint a malicious scenario in which our most personal identity, our own image, is constantly being tracked. Or worse. In "1984," George Orwell's dystopian vision included "facecrime," where a person is presumed guilty of "thoughtcrime" based on their facial expressions. As with most technology, facial recognition will inevitably be used for good and for bad. In some cases, it'll just be very poorly executed. It will most certainly have growing pains.
But the more likely, and promising, scenario for the world's best marketers is that consumers will be willing to share their facial expressions and emotional analysis for the convenience, tailoring and rewards it offers them.
The future of facial recognition
Imagine if digital-led customer service functions were informed and prioritized according to your true emotional state, allowing a brand to respond to you more appropriately. Or imagine a financial service firm that knows how you're feeling and helps you avoid a frivolous purchase, or lets you know when your skepticism may be holding you back from something of true value. Imagine launching something in beta and not just hearing from a small, angry faction posting 1-star reviews, but surveying, in real time, the aggregate joy that the vast majority of your customers are experiencing. Up until now, in the digital experience world, we've been focused on click-path analysis. But what if we could begin to explore emotional-path analysis?
Technically speaking, Apple is keeping the most detailed facial recognition data local on the phone (i.e., the company isn't storing the facial recognition data used to unlock the phone on its servers), but app makers can (with the user's permission) use the iPhone X to read a rough map of a user's face and a stream of facial expressions. Apple may never intend to fully open the door, but the new user experience may nonetheless make facial recognition commonplace in our behaviors across devices and hardware manufacturers.
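For the technically curious, the expression stream described above is exposed to developers through Apple's ARKit face-tracking API on TrueDepth-equipped devices like the iPhone X. The sketch below is illustrative only -- the class name, delegate wiring and 0.5 thresholds are assumptions, not a production design -- but the ARKit calls themselves are real:

```swift
import ARKit

// Illustrative sketch: reading a user's expressions via ARKit face tracking.
// Requires a TrueDepth camera (e.g., iPhone X) and camera permission.
class ExpressionReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking is only supported on TrueDepth-equipped hardware.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // Called as anchors update each frame. Blend shapes are coefficients
    // from 0.0 to 1.0 describing dozens of facial movements.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for face in anchors.compactMap({ $0 as? ARFaceAnchor }) {
            let smile = face.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
            let brow = face.blendShapes[.browDownLeft]?.floatValue ?? 0
            // Thresholds here are arbitrary, for illustration only.
            if smile > 0.5 { print("User appears to be smiling") }
            if brow > 0.5 { print("User appears to be frowning") }
        }
    }
}
```

Note that this raw data is far from "emotion analysis" -- interpreting a smile coefficient as joy is a separate inference problem, and any real product would need explicit user consent.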
For years, the advertising industry has been consumed with making digital experiences more human. It started with mobile devices, touch screens and intuitive taps, and it has evolved all the way to natural-language-based AI. Now we're faced with designing for the face -- an incredibly complex and uniquely human feature, but also one that could tell us more about our customers than we've ever known.