ETH Zurich researchers have developed a new control approach that enables a legged robot, called ANYmal, to move quickly and robustly over difficult terrain. The machine learning technology, developed by a team led by ETH Zurich robotics professor Marco Hutter, allows the robot to combine visual perception of its environment with its sense of touch for the first time.
The team put the quadruped to the test on a hike up nearby Mount Etzel, a modest summit that stands some 1,098 meters (3,600 feet) above sea level. The robot covered the route's 120 vertical meters effortlessly in a 31-minute hike – 4 minutes faster than the estimated duration for human hikers – with no falls or missteps.
“The robot has learned to combine visual perception of its environment with proprioception – its sense of touch – based on direct leg contact. This allows it to tackle rough terrain faster, more efficiently, and, above all, more robustly,” Hutter says. In the future, ANYmal could be deployed anywhere that is too dangerous for humans or too impassable for other robots.
Researchers say the new neural-network-based controller combines exteroceptive (external) and proprioceptive perception for the first time. Before the robot could put its capabilities to the test in the real world, the scientists exposed the system to numerous obstacles and sources of error in a virtual training camp. This let the network learn the ideal way for the robot to overcome obstacles, as well as when it can rely on environmental data – and when it would do better to ignore that data.
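The "sources of error" in the virtual training camp can be pictured as noise injection on the robot's simulated terrain readings. The sketch below is a minimal illustration in plain Python, assuming a simple 1-D height scan; the function name and noise parameters are invented for this example, and the real system uses far richer height-map sampling and a learned encoder:

```python
import random

def corrupt_height_scan(scan, noise_std=0.05, dropout_prob=0.1,
                        offset_range=0.2, rng=None):
    """Simulate unreliable exteroceptive readings during training.

    Each height sample gets Gaussian noise, may be dropped entirely
    (returned as None, as if occluded), and the whole scan can be
    shifted by a constant offset (as if the map were mislocalized).
    All parameter values here are illustrative, not from the paper.
    """
    rng = rng or random.Random()
    offset = rng.uniform(-offset_range, offset_range)
    corrupted = []
    for h in scan:
        if rng.random() < dropout_prob:
            corrupted.append(None)  # occluded / missing sample
        else:
            corrupted.append(h + offset + rng.gauss(0.0, noise_std))
    return corrupted

# Usage: perturb a clean scan of a 0.3 m step before feeding it
# to the policy, so the network learns to cope with bad map data.
clean = [0.0, 0.0, 0.3, 0.3, 0.3]
noisy = corrupt_height_scan(clean, rng=random.Random(42))
```

Training on such corrupted inputs is what forces the network to judge, sample by sample, whether the environmental data is worth trusting.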
“With this training, the robot is able to master the most difficult natural terrain without having seen it before,” says ETH Zurich Professor Hutter. This works even if the sensor data on the immediate environment is ambiguous or vague. ANYmal then plays it safe and relies on its proprioception. According to Hutter, this allows the robot to combine the best of both worlds: the speed and efficiency of external sensing and the safety of proprioceptive sensing.
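The "best of both worlds" behavior can be thought of as a gate that down-weights exteroceptive input when it looks unreliable. Here is a toy sketch in Python, with a hand-supplied confidence score standing in for the gating that the real controller learns end-to-end from data; the function and its linear blend are illustrative assumptions, not the researchers' actual architecture:

```python
def fuse(extero, proprio, confidence):
    """Blend exteroceptive and proprioceptive terrain estimates.

    confidence in [0, 1]: 1.0 means trust the terrain map fully;
    0.0 means fall back entirely on leg-contact sensing.
    A simple linear blend stands in for the learned fusion.
    """
    return [confidence * e + (1.0 - confidence) * p
            for e, p in zip(extero, proprio)]

# Clear terrain map: follow the exteroceptive estimate.
fast = fuse([0.3, 0.3], [0.0, 0.1], confidence=1.0)   # [0.3, 0.3]
# Ambiguous map (e.g. snow, tall grass): rely on touch instead.
safe = fuse([0.3, 0.3], [0.0, 0.1], confidence=0.0)   # [0.0, 0.1]
```

At full confidence the robot moves at the speed the map allows; at zero confidence it "plays it safe" on proprioception alone, exactly the trade-off Hutter describes.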