AMBIDEX cable-driven robot learns to wash dishes, peel vegetables

Engineers from NAVER LABS have developed AMBIDEX, a cable-driven robot capable of precise movements, and trained it to acquire human physical intelligence. The robot can memorize actions and perform them autonomously, such as peeling a vegetable or catching a flying ball.

Developed by NAVER LABS in collaboration with Korea University of Technology & Education (KOREATECH), the robot arm now features an added waist that extends the available workspace, as well as a sensor head that can perceive objects. It has also been equipped with a robot hand, the "BLT Gripper", that can switch between various grasping methods.

The robot arms move thanks to cables inside the body that act like tendons. Their state and position are monitored by a neural-network algorithm that, given the motor positions and the operator's commands, computes the dynamics of the cables' movement while taking their elasticity into account.
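The idea of combining an elastic cable model with a learned correction can be sketched as follows. This is an illustrative toy model, not NAVER LABS' actual algorithm: the network architecture, stiffness value, and input layout are all assumptions made for the example.

```python
import numpy as np

# Hypothetical sketch: estimate cable tensions from motor positions and an
# operator command, combining a Hooke's-law elastic baseline with a small
# learned neural-network residual (for friction, coupling, nonlinearity).

def init_mlp(n_in, n_hidden, n_out, seed=0):
    """Randomly initialize a tiny one-hidden-layer network."""
    rng = np.random.default_rng(seed)
    return {
        "W1": rng.normal(0.0, 0.1, (n_hidden, n_in)),
        "b1": np.zeros(n_hidden),
        "W2": rng.normal(0.0, 0.1, (n_out, n_hidden)),
        "b2": np.zeros(n_out),
    }

def estimate_cable_tension(params, motor_pos, command, stiffness=50.0):
    # Elastic term: tension grows with cable stretch, i.e. the gap between
    # the motor position and the commanded position (spring-like cables).
    elastic = stiffness * (motor_pos - command)
    # Learned residual on top of the elastic baseline.
    x = np.concatenate([motor_pos, command])
    h = np.tanh(params["W1"] @ x + params["b1"])
    residual = params["W2"] @ h + params["b2"]
    return elastic + residual

# One arm with 7 actuated cables (illustrative dimensions).
params = init_mlp(n_in=14, n_hidden=32, n_out=7)
motor_pos = np.zeros(7)
command = np.full(7, 0.1)
tension = estimate_cable_tension(params, motor_pos, command)
print(tension.shape)  # (7,)
```

In a real controller the residual network would be trained on measured cable states; here the weights are random and the output only illustrates the data flow.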

The operator stand has a simpler design than the robot itself but likewise uses electric motors and cables, which allows both devices (robot and stand) to maintain the same level of resistance when moving. This also lets AMBIDEX learn from examples, capturing movement parameters visually and kinematically and memorizing hand positions and applied forces. The engineers demonstrated this with the examples of washing dishes and plugging an electrical plug into an outlet. The robot can also correct its movements if a person or another object displaces it.

“While spreading jam on bread or taking coins out of a pocket is an easy task that humans can unconsciously do, it is difficult to explain or program such processes and principles,” engineers said in a statement. To overcome this, they developed a haptic device exclusively for AMBIDEX.

The haptic device features a 1:1 size ratio with the human body, 7 degrees of freedom per arm, just like a human arm, and bilateral teleoperation, which delivers force in both directions between the human and the robot. With this device the engineers could obtain detailed force-control data from a human demonstration and use it as a learning reference for the robot. Training the robot via the haptic device with methods such as reinforcement learning is efficient enough that the robot can successfully perform a task on its own after a single demonstration.
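The "force both ways" idea behind bilateral teleoperation can be illustrated with a classic position-position coupling scheme. This is a generic textbook sketch, not AMBIDEX's actual controller; the gains and the PD structure are assumptions for the example.

```python
# Illustrative bilateral teleoperation sketch (position-position scheme).
# Master (the haptic device) and robot exchange positions; each side feels
# a spring-damper coupling force, so force is reflected in both directions:
# the robot is pulled toward the operator, and the operator feels the
# robot's lag or contact resistance.

def coupling_forces(master_pos, robot_pos, master_vel=0.0, robot_vel=0.0,
                    kp=100.0, kd=5.0):
    """Return (force on master, force on robot) from a PD coupling."""
    error = master_pos - robot_pos
    derror = master_vel - robot_vel
    f = kp * error + kd * derror
    # Equal and opposite: robot is driven forward, operator feels resistance.
    return -f, f

# If the robot lags 2 cm behind the operator's hand, the robot is pushed
# toward the operator's position while the operator feels the lag.
f_master, f_robot = coupling_forces(master_pos=0.02, robot_pos=0.0)
print(f_master, f_robot)
```

The same coupling force, recorded alongside the operator's motion, is the kind of signal that can serve as a force-control reference when training the robot from a demonstration.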