These days, technologies are increasingly being deployed in society to detect facial emotions and expressions. However, researchers have now shown that some algorithms designed to read emotions fail miserably at spotting liars.
A group of researchers from the University of Southern California (USC) put AI's ability to detect lies to the test, and the team found that the algorithms failed basic tests as truth detectors. The team completed a pair of studies whose results undermine both popular psychology and common AI expression-understanding techniques, both of which assume that facial expressions reveal what people are thinking.
“Both people and so-called ’emotion reading’ algorithms rely on folk wisdom that our emotions are written on our face,” said Jonathan Gratch, a professor of computer science at USC.
The team presented their findings at the International Conference on Affective Computing and Intelligent Interaction in Cambridge, England.
We all know that people can lie without showing obvious signs of it on their face; for instance, the average politician cheerfully utters false statements. The concept of a ‘poker face’ that masks one’s true emotions is not new, but algorithms are not good at catching duplicity. Yet such machines are increasingly deployed to read human emotions and inform life-changing decisions.
For their study, the team designed a game in which 700 people’s faces were tracked while they played for money. The researchers then asked the participants to review their behavior and provide insight into how they used expressions to gain an advantage and whether those expressions matched their feelings.
The results show that smiles were the most common facial expression, regardless of what participants were actually feeling. Moreover, participants were mostly poor at judging facial emotion, suggesting that people smile for many reasons, not just happiness.
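This pitfall can be sketched in code. The toy example below (not the USC study's actual method; the function, data, and labels are all invented for illustration) shows how a naive "emotion reader" that follows the folk rule smile = happiness performs poorly once people smile for other reasons:

```python
# Hypothetical sketch: a naive emotion reader that encodes the
# folk wisdom the article questions -- every smile means happiness.

def naive_emotion_reader(expression):
    """Map an observed facial expression to a predicted feeling."""
    return "happy" if expression == "smile" else "neutral"

# Invented observations: (observed expression, actual feeling).
# People smile out of politeness, nervousness, or bluffing,
# not only out of happiness.
observations = [
    ("smile", "happy"),
    ("smile", "nervous"),
    ("smile", "bluffing"),
    ("smile", "polite"),
    ("neutral", "neutral"),
]

correct = sum(
    naive_emotion_reader(expr) == feeling
    for expr, feeling in observations
)
accuracy = correct / len(observations)
print(f"accuracy: {accuracy:.0%}")  # the smile rule is right only 2 times in 5
```

Under these made-up observations the rule scores 40%, illustrating why reading a smile as happiness is an unreliable basis for inferring what someone actually feels.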
“These discoveries emphasize the limits of technology use to predict feelings and intentions,” Gratch said. The researchers argued that commonly used emotion-reading algorithms often misinterpret what they are looking at.