The Technology That Unmasks Your Hidden Emotions - WSJ

Jan. 28, 2015 2:13 p.m. ET

Paul Ekman, perhaps the world’s most famous face reader, fears he has created a monster.

The 80-year-old psychologist pioneered the study of facial expressions in the 1970s, creating a catalog of more than 5,000 muscle movements to show how the subtlest wrinkling of the nose or lift of an eyebrow reveals hidden emotions.

Now, a group of young companies with names like Emotient Inc., Affectiva Inc. and Eyeris are using Dr. Ekman’s research as the backbone of a technology that relies on algorithms to analyze people’s faces and potentially discover their deepest feelings. Collectively, they are amassing an enormous visual database of human emotions, seeking patterns that can predict emotional reactions and behavior on a massive scale.

Dr. Ekman, who agreed to become an adviser to Emotient, says he is torn between the potential power of all this data and the need to ensure it is used responsibly, without infringing on personal privacy.

So far, the technology has been used mostly for market research. Emotient, a San Diego startup whose software can recognize emotions from a database of microexpressions that happen in a fraction of a second, has worked with Honda Motor Co. and Procter & Gamble Co. to gauge people’s emotions as they try out products. Affectiva, an emotion-detection software maker based in Waltham, Mass., has used webcams to monitor consumers as they watch ads for companies like Coca-Cola Co. and Unilever PLC.

The evolving technology has the potential to help people or even save lives. Cameras that could sense when a trucker is exhausted might prevent him from falling asleep at the wheel. Cameras embedded with emotion-sensing software in the classroom could help teachers determine whether they are holding their students’ attention.

But other applications are likely to breed privacy concerns. One retailer, for instance, is starting to test software embedded in security cameras that can scan people’s faces and divine their emotions as they walk in and out of its stores. Eyeris, based in Mountain View, Calif., says it has sold its software to federal law-enforcement agencies for use in interrogations.

The danger, Dr. Ekman and privacy advocates say, is that the technology could reveal people’s emotions without their consent, and their feelings could be misinterpreted. People might try to use the software to determine whether their spouse was lying, police might read the emotions of crowds or employers might use it to secretly monitor workers or job applicants.

“I can’t control usage,” Dr. Ekman says of his catalog, called the Facial Action Coding System. “I can only be certain that what I’m providing is at least an accurate depiction of when someone is concealing emotion.”

In Dr. Ekman’s analysis, there is no such thing as a simple smile or a frown. Facial movements are broken down into more-nuanced expressions; there are seven ways a forehead can furrow.
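In software terms, a FACS-style coding treats an expression as a combination of numbered "action units" rather than a single label. The sketch below shows roughly how that representation might look; the action-unit numbers follow Ekman's published codes (for example, AU1 is the inner brow raiser and AU12 the lip corner puller), but the emotion prototypes and the label_expression helper are illustrative, not Emotient's actual mapping.

```python
# A minimal sketch of a FACS-style representation: expressions are sets of
# action units (AUs), and emotions correspond to characteristic AU combinations.
# The prototypes below are illustrative, not any vendor's production mapping.

from typing import Dict, FrozenSet, Optional

# Illustrative subset of action units (AU number -> description).
ACTION_UNITS: Dict[int, str] = {
    1: "inner brow raiser",
    2: "outer brow raiser",
    4: "brow lowerer",
    6: "cheek raiser",
    12: "lip corner puller",
    15: "lip corner depressor",
}

# An expression is a combination of AUs, not a single movement.
EMOTION_PROTOTYPES: Dict[str, FrozenSet[int]] = {
    "joy": frozenset({6, 12}),        # the "Duchenne" smile: cheeks plus lip corners
    "sadness": frozenset({1, 4, 15}),
    "surprise": frozenset({1, 2}),
}

def label_expression(observed_aus: FrozenSet[int]) -> Optional[str]:
    """Return the first emotion whose prototype AUs are all present."""
    for emotion, prototype in EMOTION_PROTOTYPES.items():
        if prototype <= observed_aus:
            return emotion
    return None

if __name__ == "__main__":
    print(label_expression(frozenset({6, 12})))  # -> "joy"
    print(label_expression(frozenset({12})))     # -> None (a smile without the cheek raiser)
```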

Dr. Ekman’s atlas has been used extensively by psychologists and by law-enforcement and military personnel—including interrogators at the Abu Ghraib prison in Iraq—and was the inspiration for the TV drama “Lie to Me.”

To train its software’s algorithm, Emotient has recorded the facial reactions of an ethnically diverse group of hundreds of thousands of people participating in marketing research for its clients via video chat. The software extracts at least 90,000 data points from each frame, everything from abstract patterns of light to tiny muscular movements, which are sorted by emotional categories, such as anger, disgust, joy, surprise or boredom.
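The pipeline the article describes, extracting a large feature vector from each frame and scoring it against emotion categories, might look roughly like the sketch below. Emotient's actual models are proprietary; the extract_features placeholder, the linear classifier and its weights here are assumptions, with only the feature count and category names taken from the article.

```python
# Rough sketch of a per-frame emotion-scoring pipeline: featurize a frame,
# then convert classifier scores into per-emotion probabilities.
# The feature extractor and weights are stand-ins, not Emotient's models.

import numpy as np

EMOTIONS = ["anger", "disgust", "joy", "surprise", "boredom"]
N_FEATURES = 90_000  # order of magnitude cited in the article

rng = np.random.default_rng(0)

def extract_features(frame: np.ndarray) -> np.ndarray:
    """Placeholder: a real system would encode lighting patterns and facial-muscle
    movements; here we simply flatten the pixels into a fixed-length vector."""
    flat = frame.astype(np.float64).ravel()
    out = np.zeros(N_FEATURES)
    out[: min(flat.size, N_FEATURES)] = flat[:N_FEATURES]
    return out

# Hypothetical trained weights: one row of coefficients per emotion category.
W = rng.normal(scale=1e-3, size=(len(EMOTIONS), N_FEATURES))

def classify_frame(frame: np.ndarray) -> dict:
    """Return a probability for each emotion category for a single frame."""
    scores = W @ extract_features(frame)
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()
    return dict(zip(EMOTIONS, probs.round(3)))

if __name__ == "__main__":
    fake_frame = rng.integers(0, 256, size=(480, 640), dtype=np.uint8)
    print(classify_frame(fake_frame))
```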

Rival Affectiva says it has measured seven billion emotional reactions from 2.4 million face videos in 80 countries. The company says the sheer scope of its data has allowed it to draw conclusions about people across cultures and in different settings. For instance, it says it has learned that women smile more than men, and that Indonesians and South Africans are the world’s least and most expressive people, respectively.

The startups share the goal of embedding their software in the tiniest of cameras. Affectiva is teaming up with OoVoo LLC, a video-chat service for smartphones that has 100 million users, to build an app that could reveal people’s emotions during mobile video chats.

Its peers, too, are expanding their reach. A pediatrics researcher at the University of San Diego is testing a version of Emotient software on children who have had appendix surgery, to see whether it can signal their level of pain. An unidentified retailer is using Emotient’s software in its security cameras to gauge whether shoppers are pleased when looking at products and leaving the store.


Eyeris says it envisions therapeutic apps that could detect when a person feels stress. The company said it has struck deals with federal law-enforcement authorities, but declined to identify them.

Emotient says it prefers not to have its software used for police work or federal security matters. Affectiva says it has turned down funding offers from federal intelligence agencies.

As with many other technologies, emotion-detection software raises all sorts of privacy questions. “I can see few things more invasive than trying to record someone’s emotions in a database,” said Ginger McCall, a privacy advocate.

In the mid-2000s, former detective Charles Lieberman trained detectives in the New York Police Department’s counterterrorism unit in Dr. Ekman’s facial-coding system. He said the technology could help interrogators if they could identify inconsistencies between a suspect’s story and emotions revealed on his or her face. But, he cautioned, it is important to “recognize its limitations—it can lead you in the right direction but is not definitive.”

Problems could also arise if the software isn’t perfectly accurate. Emotions, such as sadness or frustration, could be wrongly interpreted. People could be wrongly pegged as liars. Dr. Ekman says Emotient’s software is highly accurate, but the accuracy of the system hasn’t been independently tested.

With no regulation, the companies are writing the privacy rules as they go.

Ken Denman, CEO of Emotient, says his company makes a point of discarding the images of individual faces within seconds after it has logged the sentiment they express. “There’s very little value in the facial expression of any individual,” he said.

Affectiva says it stores videos of faces only if the person involved consents. On mobile phones, the work of converting microexpressions to data points takes place on the phone itself. No images are sent back to the company.
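That on-device design could be sketched roughly as below: frames are analyzed locally, and only a small numeric summary, never the pixels, is ever serialized for transmission. The payload fields and the analyze_frame helper are illustrative assumptions, not Affectiva's actual protocol.

```python
# Sketch of on-device processing: convert frames to emotion scores locally and
# package only aggregated numbers for upload; raw images never leave the phone.
# The analysis step and payload format are hypothetical.

import json
import numpy as np

EMOTIONS = ["anger", "disgust", "joy", "surprise", "boredom"]

def analyze_frame(frame: np.ndarray) -> dict:
    """Placeholder on-device analysis: returns per-emotion scores for one frame."""
    rng = np.random.default_rng(int(frame.sum()) % (2**32))
    scores = rng.random(len(EMOTIONS))
    scores /= scores.sum()
    return dict(zip(EMOTIONS, scores.round(3).tolist()))

def build_upload_payload(frames: list) -> str:
    """Aggregate per-frame scores into a small JSON blob; pixel data is discarded."""
    per_frame = [analyze_frame(f) for f in frames]
    summary = {
        "n_frames": len(per_frame),
        "mean_scores": {
            e: round(sum(p[e] for p in per_frame) / len(per_frame), 3)
            for e in EMOTIONS
        },
    }
    return json.dumps(summary)  # only this string would be transmitted

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    clip = [rng.integers(0, 256, size=(48, 48), dtype=np.uint8) for _ in range(5)]
    print(build_upload_payload(clip))
```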

Both Affectiva and Emotient acknowledge they have no control over how third parties using their software might store or use images of people’s faces and emotions.

Dr. Ekman says he hopes the government will step in and write rules to protect people. He says that in public spaces, such as shopping malls, consumers should at least be informed if their emotions are captured.

Dr. Ekman says he believes that, on balance, his tools have done more good than harm. But the new technology, with its ability to instantaneously scan the emotions of crowds of people, would be much easier to abuse.

“People don’t even know that that’s possible,” he adds.

Write to Elizabeth Dwoskin at elizabeth.dwoskin@wsj.com and Evelyn M. Rusli at evelyn.rusli@wsj.com

http://www.wsj.com/articles/startups-see-your-face-unmask-your-emotions-1422472398?mod=LS1&ref=/news/technology