Human facial movements convey emotions, help us communicate nonverbally, and enable physical activities such as eating and drinking.
Last year, Cornell University engineers unveiled an intriguing research project, C-Face, which uses wearable cameras to identify a person's facial expressions even when they are wearing a face mask. Now the technology has taken a more practical form.
The team has developed a necklace-style wearable called NeckFace, which continuously tracks full facial expressions using infrared cameras rather than the RGB cameras of the original. Mounted beneath the neck, the cameras capture images of the chin and face; from those images, the system tracks the wearer's facial movements and produces a 3D reconstruction of their expressions.
Compared with C-Face, NeckFace offers significantly better performance and privacy, and it gives the wearer the option of a less obtrusive neck-mounted device.
To test the effectiveness of the smart necklace, the team conducted a user study with 13 participants, each of whom performed eight facial expressions while sitting and eight more while walking. Participants were also asked to rotate their heads while making the expressions and to remove and remount the device, for a total of 52 different facial shapes. The group compared the necklace system's performance against the TrueDepth 3D camera on the iPhone X and found it nearly as accurate.
NeckFace was tested in two designs: a neckband with twin cameras and a necklace with a pendant-like infrared (IR) camera. The neckband proved more accurate than the necklace, the researchers said, possibly because its two cameras could capture more information from both sides of the face than the necklace's single center-mounted camera.
When optimized, the device could be particularly useful in the mental health realm for tracking people’s emotions over the course of a day. Other potential applications include virtual conferencing when a front-facing camera is not an option; facial expression detection in virtual reality scenarios; and silent speech recognition.
“Can we actually see how your emotion varies throughout a day?” said team leader Cheng Zhang. “With this technology, we could have a database on how you’re doing physically and mentally throughout the day, and that means you could track your own behaviors. And also, a doctor could use the information to support a decision.”