NASA’s Jet Propulsion Laboratory is testing a versatile, snake-like robot that would autonomously map, traverse, and explore previously inaccessible destinations.
Called EELS (Exobiology Extant Life Surveyor), the self-propelled autonomous robot is meant to one day climb down the icy vents of Saturn’s moon Enceladus and descend into its subsurface ocean to look for signs of life.
EELS could pick a safe course through a wide variety of terrain on Earth, the Moon, and far beyond, including undulating sand and ice, cliff walls, craters too steep for rovers, underground lava tubes, and labyrinthine spaces within glaciers.
The project team began building the first prototype in 2019 and has been revising it continually. They’ve tested white, 3D-printed plastic screws on looser terrain like sand and soft snow, as well as sharper, black metal screws on ice.
In its current form, the EELS 1.0 robot weighs about 220 pounds (100 kilograms) and is 13 feet (4 meters) long. It’s composed of 10 identical segments that rotate, using screw threads for propulsion, traction, and grip. These individual segments can even act as propellers, allowing the EELS robot to explore its surroundings underwater.
“It has the capability to go to locations where other robots can’t go. Though some robots are better at one particular type of terrain or other, the idea for EELS is the ability to do it all,” said JPL’s Matthew Robinson, EELS project manager.
The robot has been put to the test in sandy, snowy, and icy environments, from the Mars Yard at JPL to a ‘robot playground’ created at a ski resort in the snowy mountains of Southern California, and even a local indoor ice rink. Because of the long communications lag between Earth and deep space, EELS is designed to autonomously sense its environment, calculate risk, travel, and gather data with yet-to-be-determined science instruments. When something goes wrong, the goal is for the robot to recover on its own, without human assistance.
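EELS’s flight software is not public, so the sense–assess–act–recover cycle described above can only be illustrated schematically. In this sketch every function name, the risk threshold, and the stubbed sensor readings are invented for the example:

```python
def autonomous_cycle(sense, assess_risk, act, recover, steps=10):
    """Run a simplified sense -> assess risk -> act loop with self-recovery.

    All four callables are placeholders standing in for perception,
    risk estimation, locomotion, and fault recovery. When a step is
    judged too risky, the robot recovers on its own instead of acting.
    """
    history = []
    for _ in range(steps):
        observation = sense()
        risk = assess_risk(observation)
        if risk > 0.8:          # threshold chosen arbitrarily for the demo
            recover()           # e.g. back out of the hazard and re-plan
            history.append("recovered")
        else:
            act(observation)
            history.append("moved")
    return history

# Stubbed demo: three sensor readings, the middle one too risky to cross.
readings = iter([0.1, 0.9, 0.2])
history = autonomous_cycle(
    sense=lambda: next(readings),
    assess_risk=lambda reading: reading,  # reading already is a risk score
    act=lambda obs: None,
    recover=lambda: None,
    steps=3,
)
```

The point of the structure is that recovery is just another branch of the normal control loop, so no ground operator needs to intervene when a step fails.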
The snake robot creates a 3D map of its surroundings using four pairs of stereo cameras and lidar, which is similar to radar but employs short laser pulses instead of radio waves. With the data from those sensors, navigation algorithms figure out the safest path forward.
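The article doesn’t describe the navigation algorithms themselves. As an illustrative sketch only (the grid, the costs, and all names here are invented), picking the safest path over a 2-D risk map built from sensor data could be done with Dijkstra’s algorithm, minimizing cumulative hazard rather than distance:

```python
import heapq

def safest_path(risk, start, goal):
    """Dijkstra's search over a 2-D grid, minimizing cumulative risk.

    risk[r][c] is the hazard cost of entering cell (r, c);
    None marks impassable terrain (e.g. a crevasse).
    Returns (path, total_risk), or (None, inf) if the goal is unreachable.
    """
    rows, cols = len(risk), len(risk[0])
    best = {start: 0.0}
    prev = {}
    frontier = [(0.0, start)]
    while frontier:
        cost, cell = heapq.heappop(frontier)
        if cell == goal:
            path = [cell]                 # walk predecessors back to start
            while cell in prev:
                cell = prev[cell]
                path.append(cell)
            return list(reversed(path)), cost
        if cost > best.get(cell, float("inf")):
            continue                      # stale queue entry
        r, c = cell
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and risk[nr][nc] is not None:
                ncost = cost + risk[nr][nc]
                if ncost < best.get((nr, nc), float("inf")):
                    best[(nr, nc)] = ncost
                    prev[(nr, nc)] = cell
                    heapq.heappush(frontier, (ncost, (nr, nc)))
    return None, float("inf")

# A tiny map: low numbers are safe, high numbers hazardous, None impassable.
grid = [
    [1,    9,    9,    1],
    [1, None,    9,    1],
    [1, None,    9,    1],
    [1,    1,    1,    1],
]
path, total = safest_path(grid, (0, 0), (0, 3))
```

On this map the planner detours around the high-risk cells along the top row even though the direct route is shorter, which is exactly the trade a risk-aware navigator makes.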
In its final form, the robot will contain 48 actuators that give it the flexibility to assume multiple configurations but add complexity for both the hardware and software teams.
“When you’re going places where you don’t know what you’ll find, you want to send a versatile, risk-aware robot that’s prepared for uncertainty – and can make decisions on its own,” said Matthew Robinson.