Accurately portraying real, physical movements in the digital world has always been a challenge for programmers and engineers.
Now, scientists from ETH Zurich and New York University have developed a user-friendly, stretch-sensing data glove that captures real-time, interactive hand poses with far greater precision than previous devices.
Most existing gesture-sensing gloves are bulky, expensive, and measure relatively few degrees of freedom. The new gloves, by contrast, can be manufactured at low cost using fabrication methods and tools that are readily available today.
The researchers created a silicone compound in the shape of a hand with 44 embedded stretch sensors and combined it with a soft, thin fabric, producing gloves that are comfortable and unobtrusive to wear.
The input device uses a specially constructed set of algorithms to process the sensor data coming from the gloved hand. The data-driven model is trained only once; to gather training data, the researchers used an inexpensive, off-the-shelf hand pose reconstruction system.
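The article does not describe the model itself, which in the actual system is more sophisticated. As a rough, hypothetical illustration of the idea, a data-driven mapping from the glove's 44 stretch-sensor readings to a set of hand joint angles could be sketched with ordinary least squares (the sensor count comes from the article; the joint count, the simulated data, and the linear model are all assumptions for illustration):

```python
import numpy as np

N_SENSORS = 44  # stretch sensors embedded in the glove (from the article)
N_JOINTS = 15   # hypothetical number of joint angles to reconstruct

rng = np.random.default_rng(0)

# Simulated training data: sensor readings paired with ground-truth poses.
# In the study, ground truth came from an off-the-shelf camera-based
# hand pose reconstruction system; here we fabricate a linear relationship.
true_mapping = rng.normal(size=(N_SENSORS, N_JOINTS))
sensors_train = rng.normal(size=(1000, N_SENSORS))
poses_train = sensors_train @ true_mapping

# "Train once": fit the linear model with least squares.
weights, *_ = np.linalg.lstsq(sensors_train, poses_train, rcond=None)

# At run time, a new sensor reading yields a pose estimate directly,
# with no camera or external equipment required.
reading = rng.normal(size=(1, N_SENSORS))
pose_estimate = reading @ weights
```

Once trained, the model turns each incoming sensor vector into a pose estimate with a single matrix multiply, which is why such a glove can track hand poses in real time.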
Notably, the team says these stretch-sensing gloves do not require a camera-based setup or any additional external equipment, and they can begin tracking hand poses in real time with only minimal calibration.
For the study, the researchers compared their sensor glove's accuracy with that of two commercial glove products and found that their glove produced the lowest error for each interactive pose.
In future work, the team plans to explore how a similar sensor approach could be used to track a whole arm, yielding the global position and orientation of the glove, or perhaps even a full bodysuit. The current glove is a medium size, but the team is also considering expanding it to other sizes and shapes.
The team will demonstrate its innovative glove at SIGGRAPH 2019, held 28 July–1 August in Los Angeles. A paper describing the research is available online.