[Researchers at Cornell have developed a telepresence robot that automatically mirrors a user’s physical movements and body language, which appears to enable “a heightened sense of co-presence and behavioral interdependence.” The story below is from the Cornell Chronicle; the new paper describing the project appears in the CHI 2023 Proceedings and features three videos, including a 4:51 presentation also available on YouTube. See lead author Mose Sakashita’s website for more interesting presence-related work. –Matthew]
[Image: Mose Sakashita, a doctoral student in the field of information science, with the ReMotion robot]
I, robot: Remote proxy collaborates on your behalf
By Louis DiPietro, Cornell Ann S. Bowers College of Computing and Information Science
May 11, 2023
Cornell researchers have developed a robot called ReMotion that occupies physical space on a remote user’s behalf, automatically mirroring the user’s movements in real time and conveying key body language that is lost in standard virtual environments.
“Pointing gestures, the perception of another’s gaze, intuitively knowing where someone’s attention is – in remote settings, we lose these nonverbal, implicit cues that are very important for carrying out design activities,” said Mose Sakashita, a doctoral student in the field of information science.
Sakashita is the lead author of “ReMotion: Supporting Remote Collaboration in Open Space with Automatic Robotic Embodiment,” which he presented at the Association for Computing Machinery CHI Conference on Human Factors in Computing Systems in Hamburg, Germany, in April. “With ReMotion, we show that we can enable rapid, dynamic interactions through the help of a mobile, automated robot.”
With further development, ReMotion could be deployed in virtual collaborative environments as well as in classrooms and other educational settings, Sakashita said.
The idea for ReMotion came out of Sakashita’s experience as a teaching assistant for a popular rapid prototyping course in the spring 2020 semester, which was held largely online due to COVID-19. Confined with students to a virtual learning environment, Sakashita came to understand that physical movement is vital in collaborative design projects: teammates lean in to survey parts of the prototype, inspect circuits, troubleshoot faulty code together, and then draw up solutions on a nearby whiteboard.
This range of motion is all but lost in a virtual environment, as are the subtle ways collaborators communicate through body language and expressions, he said.
“It was super challenging to teach. There are so many tasks that are involved when you’re doing a hands-on design activity,” Sakashita said. “The kind of instinctive, dynamic transitions we make – like gesturing or addressing a collaborator – are too dynamic to simulate through Zoom.”
The lean, nearly six-foot-tall ReMotion device is outfitted with a monitor for a head, omnidirectional wheels for feet and game-engine software for brains. It automatically mirrors the remote user’s movements thanks to another Cornell-made device, NeckFace, which the remote user wears to track head and body movements. That motion data is then streamed to the ReMotion robot in real time.
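The article doesn’t give implementation details, but the pipeline it describes (a wearable tracker on the remote user, motion data streamed over the network, and actuation on the robot) can be illustrated with a minimal sketch. Everything below is an assumption for illustration: the JSON message fields, the UDP transport, the 60 Hz sample rate, and the stub functions stand in for ReMotion’s actual tracking and motor-control code, which the paper does not publish in this form.

```python
# Hypothetical sketch of a ReMotion-style data flow: a wearable tracker
# streams the user's head/body pose over the network, and the robot
# applies each pose to its base and monitor "head". All names, fields,
# and rates here are illustrative assumptions, not the authors' code.
import json
import math
import socket
import time

PORT = 9000  # arbitrary port chosen for this sketch


def read_tracker() -> dict:
    """Stand-in for the wearable tracker: emits a slowly varying pose."""
    t = time.time()
    return {
        "x": math.sin(t), "y": math.cos(t),          # body position
        "heading": t % (2 * math.pi),                 # body orientation
        "pitch": 0.1 * math.sin(t), "yaw": 0.2 * math.cos(t),  # head pose
    }


def stream_pose(robot_host: str = "127.0.0.1") -> None:
    """Sender side: sample the tracker ~60 times/s and stream over UDP."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        sock.sendto(json.dumps(read_tracker()).encode(), (robot_host, PORT))
        time.sleep(1 / 60)  # sample rate is a guess, not from the paper


def run_robot() -> None:
    """Receiver side: apply each incoming pose to the robot's actuators."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", PORT))
    while True:
        data, _ = sock.recvfrom(1024)
        pose = json.loads(data)
        # Real hardware would drive the omnidirectional base toward
        # (x, y, heading) and orient the monitor to (pitch, yaw);
        # here we just print the resulting command.
        print(f"base -> ({pose['x']:+.2f}, {pose['y']:+.2f}, "
              f"{pose['heading']:.2f})  head -> ({pose['pitch']:+.2f}, "
              f"{pose['yaw']:+.2f})")
```

Running `run_robot()` in one process and `stream_pose()` in another mimics the one-way, real-time mirroring the article describes; the key design point is that the remote user never steers anything, since their natural movement is the control signal.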
Telepresence robots are not new, but remote users generally need to steer them manually, distracting from the task at hand, researchers said. Other options, such as virtual and mixed reality collaboration, can also require an active role from the user, and headsets may limit peripheral awareness, researchers added.
In a small study of about a dozen participants, nearly all reported a heightened sense of co-presence and behavioral interdependence when using ReMotion compared to an existing telerobotic system. Participants also reported significantly higher shared attention among remote collaborators.
In its current form, ReMotion only works with two users in a one-on-one remote environment, and each user must occupy physical spaces of identical size and layout. In future work, ReMotion developers intend to explore asymmetrical scenarios, like a single remote team member collaborating virtually via ReMotion with multiple teammates in a larger room.
Other co-authors are: Ruidong Zhang and Hyunju Kim, doctoral students in the field of information science; Xiaoyi Li, M.P.S. ’21; Michael Russo, M.P.S. ’21; Cheng Zhang, assistant professor of information science; Malte Jung, associate professor of information science and the Nancy H. ’62 and Philip M. ’62 Young Sesquicentennial Faculty Fellow; and François Guimbretière, professor of information science.
This research was funded in part by the National Science Foundation and the Nakajima Foundation.