The business of emotion detection has moved beyond science fiction to a $20 billion industry in just a few years. Companies such as IBM and Microsoft promote software that uses artificial intelligence to analyse facial expressions and match them to particular emotions. This capability, they believe, would be a superpower for gauging a customer’s response to a new product or a job candidate’s feelings during an interview. However, a comprehensive review of the research on emotion detection suggests that the algorithms behind such technologies are deeply flawed, because most emotion-detection AI draws its inferences purely from mapping facial positions. It should come as no surprise that something as complex as a human emotion cannot be judged from facial expressions alone.
“About 20 to 30 percent of the time, people make the expected facial expression,” such as smiling when happy, said Lisa Feldman Barrett, a professor of psychology at Northeastern University, who worked on the report published earlier this month. But the rest of the time, they don’t. “They’re not moving their faces in random ways. They’re expressing emotion in ways that are specific to the situation.”
A group of five scientists from the Association for Psychological Science, after reviewing more than 1,000 studies and spending two years exploring this idea, concluded that the relationship between facial expression and emotion is nebulous, convoluted and far from universal.
When Microsoft presented its emotion-detection technology in 2015, it said its algorithms could “recognise eight core emotional states — anger, contempt, fear, disgust, happiness, neutral, sadness or surprise — based on universal facial expressions that reflect those feelings.”
Barrett commented that this is a common justification for such technology, and that these kinds of technological advances come with limitations. The companies are not trying to be misleading, she said, but they need to change their approach to detecting human emotion if they want reliable results. To get on the right track, Barrett said, companies should work with far more data, training their programs to consider body positioning, vocal characteristics and situational context, just as a human would.
Nonetheless, a claim to detect emotion is in no way a scientifically justified assessment of what is happening inside the human brain.