Design activities, such as brainstorming or critique, often take place in open spaces combining whiteboards and tables to present artifacts. In co-located settings, peripheral awareness lets participants easily understand each other’s locus of attention. However, these spatial cues are largely lost when using videoconferencing tools. Telepresence robots could restore a sense of presence, but controlling them is distracting.
To address this problem, Cornell University researchers have developed a robot called ReMotion that occupies physical space on a remote user’s behalf, automatically mirroring the user’s movements in real time and conveying key body language that is lost in standard virtual environments.
“Pointing gestures, the perception of another’s gaze, intuitively knowing where someone’s attention is – in remote settings, we lose these nonverbal, implicit cues that are very important for carrying out design activities,” said Mose Sakashita, a doctoral student in the field of information science.
The lean, nearly six-foot-tall ReMotion device is equipped with a monitor for a head, omnidirectional wheels for feet, and game-engine software for brains. It automatically mirrors the remote user’s movements thanks to another Cornell-made device, NeckFace, which the remote user wears to track head and body movements. That motion data is then streamed to the ReMotion robot in real time.
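To make the tracking-and-mirroring pipeline concrete, the following is a minimal Python sketch of the general idea: a wearable tracker produces head and body pose samples, and the robot side consumes them to point its monitor "head" and drive its omnidirectional base. All names, fields, and the 30 Hz update rate are illustrative assumptions for this sketch, not details of the actual ReMotion or NeckFace implementation.

```python
import math
import time
from dataclasses import dataclass

# One pose sample from the wearable tracker (field names are illustrative,
# not taken from the ReMotion/NeckFace system).
@dataclass
class PoseSample:
    head_yaw: float    # degrees; positive turns the monitor "head" left
    head_pitch: float  # degrees; positive tilts it up
    body_x: float      # metres in the shared room layout
    body_y: float

def read_tracker(t: float) -> PoseSample:
    """Stand-in for the wearable tracker: synthesises smooth motion over time."""
    return PoseSample(
        head_yaw=30 * math.sin(t),
        head_pitch=10 * math.sin(0.5 * t),
        body_x=0.5 * math.cos(0.2 * t),
        body_y=0.5 * math.sin(0.2 * t),
    )

def mirror(sample: PoseSample) -> None:
    """Stand-in for the robot side: point the monitor and reposition the
    omnidirectional base so the remote user's movement is reproduced."""
    print(f"head -> yaw {sample.head_yaw:+.1f}°, pitch {sample.head_pitch:+.1f}° | "
          f"base -> ({sample.body_x:+.2f} m, {sample.body_y:+.2f} m)")

if __name__ == "__main__":
    start = time.time()
    while time.time() - start < 2.0:   # stream for a couple of seconds
        mirror(read_tracker(time.time() - start))
        time.sleep(1 / 30)             # assumed ~30 Hz update loop
```

In a real deployment the tracker and robot would run on separate machines, so the pose samples would be serialized and sent over the network rather than passed through a local function call.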
Researchers said telepresence robots are not new, but remote users generally need to steer them manually, distracting from the task at hand. Other options, such as virtual reality and mixed reality collaboration, can also require an active role from the user, and headsets may limit peripheral awareness, researchers added.
In a small study of about a dozen participants, nearly all reported a heightened sense of co-presence and behavioral interdependence when using ReMotion compared to an existing telerobotic system. Participants also reported significantly higher shared attention among remote collaborators.
Currently, ReMotion only works with two users in a one-on-one remote environment, and each user must occupy physical spaces of identical size and layout. In future work, ReMotion developers intend to explore asymmetrical scenarios, like a single remote team member collaborating virtually via ReMotion with multiple teammates in a larger room.
With further development, researchers said ReMotion could be deployed in virtual collaborative environments, classrooms, and other educational settings.
Journal reference:
- Mose Sakashita, Ruidong Zhang, Xiaoyi Li, Hyunju Kim, Michael Russo, Cheng Zhang, Malte F. Jung, François Guimbretière. ReMotion: Supporting Remote Collaboration in Open Space with Automatic Robotic Embodiment. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 2023. DOI: 10.1145/3544548.3580699