VRoxy system pushes telepresence beyond just looking and talking

[Below you’ll find first a short description of the new VRoxy (VR-driven robotic proxy) telepresence system being developed by teams at Cornell and Brown Universities, and then a more detailed description of how the system works. The first story is from New Atlas (based on a report from Cornell), and the second is from Brown University (where the original version includes three different pictures and a five-minute demonstration video that’s also available on YouTube). As Cornell’s coverage notes, VRoxy is a refined version of ReMotion, which was featured in a May 2023 ISPR Presence News post. –Matthew]

[Image: Screenshot of title card from demonstration video]

VRoxy system pushes telepresence beyond just looking and talking

By Ben Coxworth
November 1, 2023

When it comes right down to it, most telepresence robots are essentially just remote-control tablets that can be steered around a room. The VRoxy system is different in that its robot replicates the user’s movements, plus it auto-pilots itself to different locations within a given space.

The system is being developed by a team of researchers from Cornell and Brown universities.

In its current functional prototype form, the VRoxy robot consists of a tubular plastic truss body with motorized omnidirectional wheels on the bottom and a video screen at the top. Also at the top are a robotic pointer finger and a Ricoh Theta V 360-degree camera.

The remotely located user simply wears a Quest Pro VR headset in their office, home or pretty much anyplace else. This differentiates VRoxy from many other gesture-replicating telepresence systems, in which relatively large, complex setups are required at both the user’s and viewer’s locations.

Via the headset, the user can switch between an immersive live view from the robot’s 360-degree camera and a pre-scanned 3D map view of the entire space in which the bot is located. Once they’ve selected a destination on that map, the robot proceeds to autonomously make its way over (assuming it’s not there already). When it arrives, the headset automatically switches back to the first-person view from the bot’s camera.
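That switch-and-navigate flow is easy to picture in code. Below is a minimal Python-style sketch of it; the `headset` and `robot` objects and all of their methods are hypothetical stand-ins for illustration, not VRoxy’s actual API.

```python
from enum import Enum, auto

class ViewMode(Enum):
    LIVE_360 = auto()   # immersive feed from the robot's 360-degree camera
    MAP_3D = auto()     # pre-scanned 3D model of the robot's space

def go_to_destination(headset, robot, destination):
    """User picks a destination on the 3D map view; the robot drives itself."""
    if robot.is_at(destination):
        headset.set_view(ViewMode.LIVE_360)   # already there, just look around
        return
    headset.set_view(ViewMode.MAP_3D)         # hide the moving camera feed
    robot.navigate_to(destination)            # autonomous path planning
    robot.wait_until_arrived()
    headset.set_view(ViewMode.LIVE_360)       # on arrival, back to first person
```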

Not only does this functionality spare the user the hassle of having to manually “drive” the robot from place to place, it also keeps them from experiencing the vertigo that may come with watching a live video feed from the bot while it’s on the move.

The VR headset monitors the user’s facial expressions and eye movements, and reproduces them in real time on an avatar of the user, which is displayed on the robot’s screen. The headset also registers head movements, which the robot mimics by panning or tilting the screen accordingly via an articulated mount.

And when the user physically points their finger at something within their headset view, the robot’s pointer finger moves to point in that same direction in the real world. Down the road, the researchers hope to equip the robot with two user-controlled arms.
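As a rough illustration of that mirroring, the sketch below maps headset yaw and pitch onto a pan/tilt screen mount and forwards a pointing ray to the robotic finger. The angle limits and every identifier (`mount`, `pointer_arm`, and so on) are assumptions made for illustration, not details from the paper.

```python
def clamp(value, lo, hi):
    """Keep a commanded angle within the mount's mechanical range."""
    return max(lo, min(hi, value))

def mirror_head_pose(head_yaw_deg, head_pitch_deg, mount):
    """Pan/tilt the robot's screen to follow the user's head (limits assumed)."""
    mount.set_pan(clamp(head_yaw_deg, -90.0, 90.0))
    mount.set_tilt(clamp(head_pitch_deg, -45.0, 45.0))

def mirror_pointing(ray_origin, ray_direction, pointer_arm):
    """Aim the robotic pointer finger along the user's pointing ray,
    assumed to be already expressed in the robot's coordinate frame."""
    pointer_arm.aim_along(ray_origin, ray_direction)
```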

In a test of the existing system, the team used VRoxy to travel back and forth along a hallway between a lab and an office, where a remote user collaborated with different people on different tasks.

The study is being led by Cornell University’s Mose Sakashita, Hyunju Kim, Ruidong Zhang and François Guimbretière, along with Brown University’s Brandon Woodard. It is described in a paper presented at the ACM Symposium on User Interface Software and Technology in San Francisco.

[From Brown University]

Researchers develop VR software to control a robot proxy through natural movements

Called VRoxy, the software has the potential to make hands-on collaboration between people working remotely and people working in physical spaces more seamless, regardless of differences in room size.

MEDIA CONTACT: Juan Siliezar, juan_siliezar@brown.edu
October 26, 2023

PROVIDENCE, R.I. [Brown University] — To date, virtual reality has been widely associated with gaming. But as this immersive technology improves, it will increasingly make its way to other spheres of life, including work, impacting how people miles apart or even across the world can collaborate.

One effort that could soon inform VR applications in professional settings comes from researchers at Brown University and Cornell University, who will present on the concept at this year’s Association for Computing Machinery Symposium on User Interface Software and Technology on Oct. 29.

VRoxy, the software program developed by the team, allows a user in a remote location to put on a VR headset and be virtually transported to a space — be it an office, laboratory or other setting — where their colleagues are physically working. There, the remote user is represented by a robot at the physical location, which lets them move through that environment using natural movements, like walking, and collaborate with colleagues through gestures such as pointing at objects, head movements (like nodding) and even facial expressions, all conveyed through the robot proxy.

The software is in its early stages. But already the researchers say it has the potential to address some of the biggest challenges in using robot proxies and augmented reality software for remote collaboration.

“Right now, motion-controlled robots for collaboration require the physical location to have the same amount of space as the remote environment, but that’s often not the case,” said Brandon Woodard, a Ph.D. student at Brown and graduate researcher for the Department of Computer Science’s Visual Computing Group. “Rooms have different dimensions. Even when meeting with people in Zoom, it’s easy to see those differences. Some could be in home offices, others in kitchens or living rooms, while others are in the work space or classroom.”

When remote users are in much smaller spaces than the physical locations they virtually visit, that presents a major obstacle for augmented reality technology in terms of full immersion and navigating that virtual space. For instance, a college student can’t physically walk to the other side of a large lab in which they are virtually present if their remote space, like a residence hall room, is physically smaller. Doing this with a mobile robot proxy requires users to manually maneuver the robot with some type of remote control, which draws their attention away.

“These actions require significant amounts of cognitive load,” said Mose Sakashita, a Ph.D. student at Cornell and the study’s lead author. “You have to focus on controlling the robot instead of focusing on the collaboration task.”

VRoxy helps to address this, the researchers say. The primary way is by introducing an innovative mapping technique that addresses differences in room size, allowing a user to walk around in a small space while a mobile robot automatically reproduces their movement in a physical location.

Here’s how it works: The remote space where the VR user will be and the physical space where they want to be are mapped using visualization software. The physical location is then rendered into a 3D model, and this 3D environment is what the VR user sees in their headset. Within it, the user sees what the researchers call teleport links, which appear as blue circles on the ground. These links allow the larger space to be compressed into the smaller one within the VR world. When the VR user steps into one of them, the screen fades to black and they emerge in another spot. That spot could be slightly farther into a room, on the other side of that room or even in another building. As the user passes through the teleport links, the robot proxy moves through the physical space accordingly, without the VR user needing to manually control it. If the user goes through a link to another building, another robot would activate.
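A loose sketch of the teleport-link idea follows. The data structure and the `user` and `robots` objects are invented here for illustration and are not VRoxy’s actual implementation; the point is just how a link can pair a spot in the compressed VR room with a real-world pose and, possibly, a different robot.

```python
from dataclasses import dataclass

@dataclass
class TeleportLink:
    vr_position: tuple        # the blue circle's spot in the compressed VR room
    physical_pose: tuple      # the corresponding pose in the real space
    robot_id: str             # which proxy robot serves that pose

def on_link_entered(user, link, robots, active_id):
    """Fade the user out, re-seat them in VR, and dispatch the right robot."""
    user.fade_to_black()
    if link.robot_id != active_id:
        robots[active_id].deactivate()      # e.g. the link leads to another building
        robots[link.robot_id].activate()
        active_id = link.robot_id
    robots[active_id].navigate_to(link.physical_pose)  # no manual driving needed
    user.move_to(link.vr_position)
    user.fade_in()
    return active_id
```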

“It’s almost like if you’re playing a video game, you go into a portal and you get warped somewhere but instead of the robot actually warping somewhere because it’s in a real physical space, it just knows to navigate to the spots where the teleport links are,” Woodard said.

The teleport links eventually bring the VR user to designated task areas. This is where the 3D digital world the VR user sees, which runs on the Unity simulation engine, fades away and a live feed of the environment turns on. In these spaces, the VR user can move around as if they were there, and the proxy robot recreates their movements using motion-tracking software on the VR headset.

Currently, the VR user can point at objects, lean in close for a better look and even walk sideways along a shelf to look for objects. The system also captures head rotations and nods. The Quest Pro headset the researchers are using can also capture facial expressions, which are shown on a screen attached to the prototype robot.
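One plausible way to picture that within-task-area retargeting is the sketch below, which maps small changes in the user’s tracked head position onto relative motion of the robot base. The 1:1 scale factor and all names are assumptions, not values from the system.

```python
POSITION_SCALE = 1.0  # assumed 1:1 motion mapping inside a task area

def retarget_body_motion(prev_head_pos, head_pos, robot_base):
    """Drive the robot base by the user's change in tracked head position.

    Forward/backward motion lets the user lean in for a closer look;
    lateral motion lets them sidestep along a shelf. Height is ignored.
    """
    dx = (head_pos[0] - prev_head_pos[0]) * POSITION_SCALE  # lateral
    dz = (head_pos[2] - prev_head_pos[2]) * POSITION_SCALE  # forward/back
    robot_base.move_relative(forward=dz, lateral=dx)
```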

When not in the live view, the collaborators in the physical space appear as 3D avatars to the VR user through the use of a motion-sensing system called Azure Kinect. This helps the VR user understand where people are while navigating the space.

The researchers can see their software being used in collaborative, hands-on environments when someone physically can’t be in the space where the work is happening, and the room they are in differs greatly from the in-person space.

Now that the initial proof of concept is complete, the research team hopes to build upon the software and the capabilities of the proxy robot, including enabling it to grab and manipulate objects.

This work was supported by the National Science Foundation and the Nakajima Foundation. The prototype robot used for VRoxy was built and is based at Cornell in Ithaca, N.Y. Woodard worked remotely to develop software for the teleport links and also tested VRoxy from Brown’s campus in Providence, R.I.

