Tuesday, March 26, 2024

Highly dexterous robot hand can operate in the dark

Robotics researchers around the world have long been trying to create true dexterity in robot hands, but the goal has been frustratingly elusive. Robot grippers and suction cups can pick and place items, but more dexterous tasks such as assembly and insertion have remained in the realm of human manipulation. However, thanks to advances in both sensing technology and the machine-learning techniques used to process the sensed data, the field of robotic manipulation is changing rapidly.

Researchers at Columbia Engineering have demonstrated a robot hand that combines an advanced sense of touch with motor learning algorithms to achieve a high level of dexterity.

For the demonstration, the researchers chose a difficult manipulation task – executing an arbitrarily large rotation of an unevenly shaped grasped object in hand while always maintaining it in a stable, secure hold. This is difficult because it requires constantly repositioning a subset of fingers while the remaining fingers keep the object stable. The hand performed this task without any visual feedback whatsoever, relying solely on touch sensing.

A dexterous robot hand equipped with five tactile fingers. Credit: Columbia University ROAM Lab

In addition to the new levels of dexterity, the hand worked without any external cameras, making it immune to occlusion and to the difficult lighting conditions that confuse vision-based algorithms. Because it does not rely on vision at all, it can even operate in complete darkness.

“While our demonstration was on a proof-of-concept task, meant to illustrate the capabilities of the hand, we believe that this level of dexterity will open up entirely new applications for robotic manipulation in the real world,” said Matei Ciocarlie, associate professor in the Departments of Mechanical Engineering and Computer Science. “Some of the more immediate uses might be in logistics and material handling, helping ease supply chain problems like the ones that have plagued our economy in recent years, and in advanced manufacturing and assembly in factories.”

For their work, the team designed and built a robot hand with five fingers and 15 independently actuated joints – each finger was equipped with the team’s touch-sensing technology. They then tested the ability of the tactile hand to perform complex manipulation tasks using new methods for motor learning. They used a method called deep reinforcement learning, augmented with new algorithms that they developed for effective exploration of possible motor strategies.
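For a concrete picture of what a touch-only controller looks like, the sketch below shows one plausible structure: a small neural network that maps tactile and joint-state readings to targets for the hand's 15 joints. The observation sizes, layer widths, and names are illustrative assumptions, not the ROAM Lab's actual architecture or training code.

```python
# Illustrative sketch only: a touch-driven policy for a 15-joint hand.
# Observation layout and network sizes are assumptions for illustration.
import torch
import torch.nn as nn

NUM_JOINTS = 15               # 15 independently actuated joints (from the article)
TACTILE_DIM = 5 * 32          # assumed: 32 tactile readings per finger, 5 fingers
PROPRIO_DIM = 2 * NUM_JOINTS  # assumed: joint positions and velocities

class TouchPolicy(nn.Module):
    """Maps tactile + proprioceptive observations to joint position targets."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(TACTILE_DIM + PROPRIO_DIM, 256),
            nn.ELU(),
            nn.Linear(256, 256),
            nn.ELU(),
            nn.Linear(256, NUM_JOINTS),  # one target per actuated joint
        )

    def forward(self, tactile, proprio):
        obs = torch.cat([tactile, proprio], dim=-1)  # no camera input anywhere
        return torch.tanh(self.net(obs))             # normalized joint targets

# Example: one forward pass on dummy sensor data.
policy = TouchPolicy()
action = policy(torch.randn(1, TACTILE_DIM), torch.randn(1, PROPRIO_DIM))
print(action.shape)  # torch.Size([1, 15])
```

In the study, a policy of roughly this kind was trained with deep reinforcement learning, with the team's sampling-based exploration method (see the journal reference below) guiding which motor strategies were tried.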

The input to the motor learning algorithms consisted exclusively of the team’s tactile and proprioceptive data, without any vision. Using simulation as a training ground, the robot completed approximately one year of practice in only hours of real time, thanks to modern physics simulators and highly parallel processors. The team then transferred this manipulation skill trained in simulation to the real robot hand, which was able to achieve the level of dexterity the team was hoping for.
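The time compression comes from running many simulated hands at once, each faster than real time. The back-of-the-envelope sketch below illustrates the idea; the number of parallel environments and the per-environment speed-up are assumed purely for illustration, since the article reports only "approximately one year of practice in only hours."

```python
# Rough illustration of how parallel, faster-than-real-time simulation
# compresses a year of practice into hours. All numbers are assumptions.
SECONDS_PER_YEAR = 365 * 24 * 3600

num_parallel_envs = 1024   # assumed: environments simulated in parallel
realtime_factor = 2.0      # assumed: each environment runs 2x real time

experience_per_wallclock_second = num_parallel_envs * realtime_factor
wallclock_seconds = SECONDS_PER_YEAR / experience_per_wallclock_second
print(f"~{wallclock_seconds / 3600:.1f} wall-clock hours for one year of practice")
# ~4.3 hours under these assumed numbers
```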

Ciocarlie noted that “the directional goal for the field remains assistive robotics in the home, the ultimate proving ground for real dexterity. In this study, we’ve shown that robot hands can also be highly dexterous based on touch sensing alone. Once we also add visual feedback into the mix along with touch, we hope to be able to achieve even more dexterity and one day start approaching the replication of the human hand.”

Journal reference:

  1. Gagan Khandate, Siqi Shang, Eric T. Chang, Tristan Luca Saidi, Johnson Adams, Matei Ciocarlie. Sampling-based Exploration for Reinforcement Learning of Dexterous Manipulation. arXiv, 2023. DOI: 10.48550/arXiv.2303.03486