Monday, April 15, 2024

New artificial skin gives robots sense of touch and beyond

Modern robots are playing an increasingly important role in security, farming, and manufacturing. Now researchers are working on giving these robots a sense of touch similar to that of humans.

Researchers at the California Institute of Technology (Caltech) have developed an artificial skin that gives robots a sense of touch similar to that of a human. The skin allows robots to sense temperature, pressure, and even toxic chemicals through a simple touch.

The multimodal robotic-sensing platform, dubbed M-Bot, integrates the artificial skin with a robotic arm and sensors that attach to human skin. A machine-learning system linking the two allows the human user to control the robot with their own movements while receiving feedback through their own skin. The platform aims to give humans more precise control over robots while also protecting them from potential hazards.

Human fingers are soft, squishy, and fleshy, whereas robotic fingers tend to be hard, metallic, plasticky, or rubbery. The printable skin is a gelatinous hydrogel that makes robotic fingertips much more like our own. Embedded within that hydrogel are the sensors that give the artificial skin its ability to detect the world around it.

This sensor attaches to the forearm skin of a human and allows them to control a robotic system through their own muscle movements. Credit: Caltech

“Inkjet printing has this cartridge that ejects droplets, and those droplets are an ink solution, but they could be a solution that we develop instead of regular ink,” says Wei Gao, Caltech’s assistant professor of medical engineering. “We’ve developed a variety of inks of nanomaterials for ourselves.”

After printing a scaffolding of silver nanoparticle wires, the researchers can then print layers of micrometer-scale sensors that can be designed to detect a variety of things. The fact that the sensors are printed makes it quicker and easier for Gao’s lab to design and try out new kinds of sensors.

“When we want to detect one given compound, we make sure the sensor has a high electrochemical response to that compound,” Gao says. “Graphene impregnated with platinum detects the explosive TNT very quickly and selectively. For a virus, we are printing carbon nanotubes, which have a very high surface area and attaching antibodies for the virus to them. This is all mass-producible and scalable.”

The research team has coupled this skin to an interactive system that allows a human user to control the robot through their own muscle movements while also receiving feedback, through their own skin, from the skin of the robot. The electrodes fastened to the human operator’s forearm are positioned to sense the electrical signals generated by the operator’s muscles as they move their hand and wrist. “We used machine learning to convert those signals into gestures for robotic control,” Gao says. “We trained the model on six different gestures.”
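To make the gesture-recognition step concrete, here is a minimal, hypothetical sketch of how muscle signals might be mapped to one of six gesture labels. Everything here is an illustrative assumption rather than a detail of the Caltech system: the electrode count, window length, RMS-amplitude features, nearest-centroid classifier, and synthetic signals are all invented for the example.

```python
import numpy as np

N_CHANNELS = 4   # assumed number of forearm electrodes
WINDOW = 200     # assumed samples per classification window

def rms_features(window):
    """Root-mean-square amplitude of each electrode channel."""
    return np.sqrt(np.mean(window ** 2, axis=0))

class NearestCentroidGestures:
    """Toy classifier: label a window by its nearest class centroid."""

    def fit(self, windows, labels):
        feats = np.array([rms_features(w) for w in windows])
        labels = np.array(labels)
        self.labels_ = np.unique(labels)
        self.centroids_ = np.array(
            [feats[labels == g].mean(axis=0) for g in self.labels_]
        )
        return self

    def predict(self, window):
        f = rms_features(window)
        dists = np.linalg.norm(self.centroids_ - f, axis=1)
        return self.labels_[np.argmin(dists)]

# Synthetic training data: six "gestures", each simulated as noise with a
# distinctive per-channel amplitude pattern.
rng = np.random.default_rng(0)
train_windows, train_labels = [], []
for gesture in range(6):
    amp = np.ones(N_CHANNELS)
    amp[gesture % N_CHANNELS] += 2.0 + gesture
    for _ in range(20):
        train_windows.append(rng.normal(0.0, amp, size=(WINDOW, N_CHANNELS)))
        train_labels.append(gesture)

clf = NearestCentroidGestures().fit(train_windows, train_labels)
```

A real system would replace the synthetic signals with sampled EMG data and would likely use a far more capable model, but the pipeline shape (window the signal, extract features, classify into a small gesture vocabulary) is a common pattern for this kind of interface.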

Gao hopes the system will find applications in everything from agriculture to security to environmental protection, allowing the operators of robots to “feel” how much pesticide is being applied to a field of crops, whether a suspicious backpack left in an airport has traces of explosives on it, or the location of a pollution source in a river. First, though, he wants to make some improvements.

“I think we have shown a proof of concept,” he says. “But we want to improve the stability of this robotic skin to make it last longer. By optimizing new inks and new materials, we hope this can be used for different kinds of targeted detections. We want to put it on more powerful robots and make them smarter, more intelligent.”