The new advanced telexistence system for the iCub robot, also called the iCub3 avatar system – developed by researchers at IIT-Istituto Italiano di Tecnologia (Italian Institute of Technology) – was tested in an online demonstration. The humanoid robot iCub3 was located at the 17th International Architecture Exhibition – La Biennale di Venezia, while the human operator was 300 km away at an IIT lab in the city of Genoa; communication between the two relied on a standard optical fiber connection.
Researchers demonstrated that the system transmits the operator's locomotion, manipulation, voice, and facial expressions to the robotic avatar while returning visual, auditory, and haptic (touch) feedback. This is the first time a system with all these features has been tested using a legged humanoid robot for remote tourism, allowing the human operator to feel and experience the avatar's surroundings.
The iCub3 robot is 25 cm taller and 19 kg heavier than the previous iCub versions, measuring 1.25 meters and tipping the scales at 52 kg (115 lb). It features a total of 53 actuated degrees of freedom – seven in each arm, nine in each hand, six in the head, three in the torso/waist, and six in each leg.
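As a quick sanity check on the joint count above, the per-limb figures do sum to 53 actuated degrees of freedom:

```python
# Per-limb actuated degrees of freedom as listed for iCub3
dof = {
    "arm": 7,    # per arm (x2)
    "hand": 9,   # per hand (x2)
    "head": 6,
    "torso": 3,
    "leg": 6,    # per leg (x2)
}

total = 2 * (dof["arm"] + dof["hand"] + dof["leg"]) + dof["head"] + dof["torso"]
print(total)  # 53
```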
The iCub3 robot has more powerful motors in its legs for faster walking speeds and more human-like balance and locomotion. The robot's head has swiveling stereo cameras that serve as eyes, dual microphones, and animated lines of LEDs representing its mouth and eyebrows. It also has an additional depth camera and latest-generation force sensors capable of withstanding the robot's higher weight. Lastly, iCub3 has a higher-capacity battery, which is located within the torso assembly instead of in a rigidly attached backpack.
The iCub3 avatar system is mainly composed of the iCub3 robot and wearable technologies named iFeel. An advanced software architecture controls and manages the interconnection between the iCub3 robot and the iFeel system.
In the demo, IIT's wearable iFeel suit tracks the operator's body motions, and the avatar system transfers them onto the iCub3 in Venice. The operator also wears a VR headset that tracks facial expressions, eyelids, and eye motions and picks up the operator's voice. These head features are projected onto the avatar, which reproduces them with a high level of fidelity: avatar and human share very similar facial expressions. The operator also wears sensorized gloves that track their hand motions and, at the same time, provide haptic feedback.
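The pipeline described above – tracking the operator's joints and transferring them onto the remote robot – amounts to a sense-retarget-actuate loop. The sketch below is purely illustrative; all function and joint names are assumptions, not IIT's actual software:

```python
# Illustrative teleoperation retargeting loop: operator joint angles from a
# motion-tracking suit are mapped to robot setpoints each control cycle.
# The joint names and stand-in I/O functions below are hypothetical.

OPERATOR_TO_ROBOT_JOINTS = {
    "l_shoulder_pitch": "l_shoulder_pitch",
    "l_elbow": "l_elbow",
    "r_knee": "r_knee",
}

def read_suit_angles():
    """Stand-in for reading the wearable suit; returns joint -> radians."""
    return {joint: 0.0 for joint in OPERATOR_TO_ROBOT_JOINTS}

def send_robot_setpoints(setpoints):
    """Stand-in for the network link to the remote robot."""
    pass

def retarget_step():
    operator = read_suit_angles()
    # Direct one-to-one mapping; a real system would also handle differing
    # limb proportions, joint limits, and balance constraints.
    setpoints = {OPERATOR_TO_ROBOT_JOINTS[j]: q for j, q in operator.items()}
    send_robot_setpoints(setpoints)
    return setpoints

# The loop would run at a fixed control rate, e.g. 100 Hz:
# while True:
#     retarget_step()
#     time.sleep(0.01)
```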
Thanks to the avatar system, the remote operator can smile, talk, shake hands with the guide in Venice, and even hug them: the avatar smiles, talks, and shakes hands accordingly. The haptic feedback units in the torso of the bodysuit let the operator feel that hug. The transmission was streamed over a standard optical fiber internet connection, with a lag of only a few milliseconds.
The system is a prototype and may be further developed for application in different scenarios, from disaster response to healthcare to the metaverse.
“We believe that this research direction has tremendous potential in many fields,” explains Daniele Pucci, Principal Investigator of the Artificial and Mechanical Intelligence (AMI) Lab at IIT in Genoa. “On the one hand, the recent pandemic taught us that advanced telepresence systems might become necessary very quickly across different fields, like healthcare and logistics. On the other hand, avatars may allow people with severe physical disabilities to work and accomplish tasks in the real world via the robotic body. This may be an evolution of rehabilitation and prosthetics technologies.”