Most telepresence robots are essentially just remote-control tablets that can be steered around a room. The VRoxy system is different: its robot replicates the user's movements, and it autonomously drives itself to different locations within a given space.
The system is being developed by a team of researchers from Cornell and Brown universities.
In its current functional prototype form, the VRoxy robot consists of a tubular plastic truss body with motorized omnidirectional wheels on the bottom and a video screen at the top. Also at the top are a robotic pointer finger and a Ricoh Theta V 360-degree camera.
The remotely located user simply wears a Quest Pro VR headset in their office, home or pretty much anyplace else. This differentiates VRoxy from many other gesture-replicating telepresence systems, in which relatively large, complex setups are required at both the user’s and viewer’s locations.
Via the headset, the user can switch between an immersive live view from the robot's 360-degree camera and a pre-scanned 3D map view of the entire space in which the bot is located. Once they've selected a destination on that map, the robot autonomously makes its way over (assuming it's not there already). When it arrives, the headset automatically switches back to the first-person view from the bot's camera.
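In broad strokes, that map-to-first-person handoff could be structured like the following Python sketch. It's purely illustrative: the `TelepresenceSession` class and the `robot` and `headset` interfaces are assumptions made for this example, not VRoxy's actual code.

```python
from enum import Enum, auto

class ViewMode(Enum):
    FIRST_PERSON = auto()   # live 360-degree feed from the robot's camera
    MAP_OVERVIEW = auto()   # pre-scanned 3D map of the remote space

class TelepresenceSession:
    def __init__(self, robot, headset):
        self.robot = robot       # assumed interface: at(), navigate_to(), camera_stream()
        self.headset = headset   # assumed interface: show_map(), show_feed()
        self.mode = ViewMode.FIRST_PERSON

    def request_move(self, destination):
        """User picks a destination on the 3D map view."""
        if self.robot.at(destination):
            return  # already there; stay in the first-person view
        # Show the map while the robot drives itself, so the user never
        # watches a moving first-person feed (the source of the vertigo
        # mentioned below).
        self.mode = ViewMode.MAP_OVERVIEW
        self.headset.show_map(highlight=destination)
        self.robot.navigate_to(destination, on_arrival=self._arrived)

    def _arrived(self):
        """Robot reached the destination: drop back into the live feed."""
        self.mode = ViewMode.FIRST_PERSON
        self.headset.show_feed(self.robot.camera_stream())
```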
Not only does this functionality spare the user the hassle of manually “driving” the robot from place to place, it also keeps them from experiencing the vertigo that can come from watching a live video feed while the bot is on the move.
The VR headset monitors the user’s facial expressions and eye movements, and reproduces them in real time on an avatar of the user, which is displayed on the robot’s screen. The headset also registers head movements, which the robot mimics by panning or tilting the screen accordingly via an articulated mount.
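The head-tracking half of that loop amounts to clamping and smoothing the headset's reported angles before sending them to the mount's servos. Here is a minimal sketch, assuming the headset reports yaw and pitch in degrees; the limits and smoothing factor are made-up values for illustration, not figures from the paper.

```python
PAN_LIMIT_DEG = 90.0    # assumed mechanical range of the articulated mount
TILT_LIMIT_DEG = 30.0   # assumed; the actual limits aren't specified
SMOOTHING = 0.2         # low-pass factor to suppress small head jitters

def clamp(value: float, limit: float) -> float:
    return max(-limit, min(limit, value))

class ScreenMount:
    """Eases the screen toward the user's latest head pose each frame."""
    def __init__(self) -> None:
        self.pan = 0.0
        self.tilt = 0.0

    def update(self, head_yaw_deg: float, head_pitch_deg: float) -> tuple[float, float]:
        target_pan = clamp(head_yaw_deg, PAN_LIMIT_DEG)
        target_tilt = clamp(head_pitch_deg, TILT_LIMIT_DEG)
        # Move a fraction of the way toward the target each update, so the
        # physical screen doesn't twitch with every micro-movement.
        self.pan += SMOOTHING * (target_pan - self.pan)
        self.tilt += SMOOTHING * (target_tilt - self.tilt)
        return self.pan, self.tilt
```

Each frame, `update()` would be called with the headset's latest orientation, and the returned pan/tilt angles sent to the mount.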
And when the user physically points their finger at something within their headset view, the robot’s pointer finger moves to point in that same direction in the real world. Down the road, the researchers hope to equip the robot with two user-controlled arms.
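Re-targeting a pointing gesture boils down to expressing the user's pointing direction in the robot's frame and converting it to joint angles for the finger. The sketch below assumes a unit direction vector from the headset's hand tracking and a known rotation between the two frames; none of these names or conventions come from the VRoxy paper.

```python
import numpy as np

def point_direction_to_servo_angles(direction_vr: np.ndarray,
                                    vr_to_robot: np.ndarray) -> tuple[float, float]:
    """direction_vr: unit pointing vector in the headset's frame.
    vr_to_robot: 3x3 rotation aligning the headset frame with the robot
    frame (identity if the first-person view keeps them aligned)."""
    d = vr_to_robot @ direction_vr
    d = d / np.linalg.norm(d)
    # Convention assumed here: x forward, y left, z up.
    yaw = np.degrees(np.arctan2(d[1], d[0]))   # rotation about the vertical axis
    pitch = np.degrees(np.arcsin(d[2]))        # elevation above horizontal
    return yaw, pitch

# Example: point slightly up and to the left of straight ahead.
v = np.array([0.9, 0.3, 0.3])
angles = point_direction_to_servo_angles(v / np.linalg.norm(v), np.eye(3))
```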
In a test of the existing prototype, the team used VRoxy to travel back and forth along a hallway between a lab and an office, where the remote user collaborated with different people on different tasks.
The study is being led by Cornell University’s Mose Sakashita, Hyunju Kim, Ruidong Zhang and François Guimbretière, along with Brown University’s Brandon Woodard. It is described in a paper presented at the ACM Symposium on User Interface Software and Technology (UIST) in San Francisco.
Source: Cornell University