‘Real Virtuality’ gives glimpse at VR’s multiplayer future

[This story about the Real Virtuality multiplayer VR system is from Wired, where it features two videos; coverage in Popular Science calls the torch pass featured in one of the videos (and the screenshot below) a possible “‘first contact’ moment for virtual reality gaming–a small step but one that makes many things possible in the future. If it doesn’t seem impressive, consider all of the points of representation that are acting in this virtual reality scenario: the environment, the torch’s play on the lighting, and the other person in the game. They’re all rendered without delay. Humanity has essentially recreated, on the basest level, The Matrix.” –Matthew]

Torch pass screenshot from Real Virtuality

‘Real Virtuality’ gives glimpse at VR’s multiplayer future

14 July 2015 by Michael Rundle

There are lots of ideas for how multiplayer will work in virtual reality — from online massively multiplayer games like EVE: Valkyrie, to Oculus Touch’s virtual hands. But it doesn’t feel like anyone has quite nailed the delicate personal interaction needed to make meeting another human realistic in VR.

A collaboration between main partners Artanim and Kenzan Media Labs, Real Virtuality relies not on standard equipment or cameras, but on a custom-built, room-sized infrared motion capture system. As a result it’s not exactly practical for the home — but the results speak for themselves. Using Artanim’s motion capture and game engine, and Kenzan’s modelling and visual effects, the system is able to capture the movement and subtle interactions of two players, and recreate them perfectly in-game.
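
To make “capture the movement and recreate it in-game” concrete, here is a minimal, hypothetical sketch in Python of the kind of per-frame loop such a system implies: tracked poses arrive from the motion capture volume and are applied directly to each player’s avatar and to shared props, with no animation layer in between. None of the names below come from Artanim or Kenzan; they are stand-ins for whatever mocap SDK and engine APIs the team actually uses.

# Minimal sketch (not Artanim/Kenzan's actual code): one frame of infrared
# motion-capture data is read and applied straight onto the in-game avatars
# and props, so each player sees the other move with no added delay or
# animation layer. All class and function names here are hypothetical.

from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple        # (x, y, z) in metres, in the shared room frame
    orientation: tuple     # quaternion (w, x, y, z)

def read_mocap_frame():
    """Stand-in for one frame of tracking: a pose per tracked body part
    (head, hands) per player, plus tracked props such as the torch."""
    return {
        "player_1/head": Pose((0.0, 1.7, 0.0), (1, 0, 0, 0)),
        "player_1/right_hand": Pose((0.3, 1.2, 0.2), (1, 0, 0, 0)),
        "player_2/head": Pose((1.5, 1.6, 0.0), (1, 0, 0, 0)),
        "torch": Pose((0.35, 1.2, 0.25), (1, 0, 0, 0)),
    }

def apply_to_engine(scene, poses):
    """Drive avatars and props directly from the captured poses; a real
    engine would update scene-graph nodes here rather than a dict."""
    for name, pose in poses.items():
        scene[name] = pose

scene = {}
for _ in range(3):      # in practice this runs at the capture/render rate
    apply_to_engine(scene, read_mocap_frame())
print(scene["torch"])

Because both players and the props live in the same room-sized tracking volume, the same coordinate frame serves the real space and the virtual one, which is what makes handing a real object to another real person line up in the headset.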

In effect the system, presented by Artanim as part of the Siggraph conference, allows players to walk, talk and interact with objects in the 3D world just as in real life — because each of them is an object in real life too.

“Visitors are able to explore the tomb by foot, to illuminate their surroundings by holding a virtual torch and to interact with the environment — for example contextual information is displayed on specific hieroglyphs by touching them,” the team explains.
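
The hieroglyph interaction the team describes amounts to a proximity test between a tracked hand and known points on the tomb wall. A rough, hypothetical illustration follows; the glyph positions, radius and function names are invented for this sketch, not taken from the white paper.

# Minimal sketch (assumptions, not the team's code): show contextual info
# when a tracked hand comes within a few centimetres of a hieroglyph.
# Positions use the same room coordinate frame as the motion-capture data.

import math

HIEROGLYPHS = {
    "hieroglyph_ankh": (1.0, 1.4, 2.0),   # world position of a touchable glyph
    "hieroglyph_eye": (1.6, 1.3, 2.0),
}
TOUCH_RADIUS = 0.05  # metres; roughly fingertip contact

def touched_glyph(hand_position):
    """Return the name of the glyph the hand is touching, if any."""
    for name, pos in HIEROGLYPHS.items():
        if math.dist(hand_position, pos) <= TOUCH_RADIUS:
            return name
    return None

# Each frame, test the tracked fingertip against the glyphs.
glyph = touched_glyph((1.01, 1.41, 1.97))
if glyph:
    print(f"show contextual panel for {glyph}")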

The aim is to develop the platform for potential use in museums and other large-scale installations. “This platform is particularly suitable for museum applications such as the visit of long lost historical sites (e.g. Maya temples, Roman cities),” the team says. “Or the experience of seminal moments in a country’s history (e.g. famous battles, key political moments)”. The full white paper on the tech, co-authored by Caecilia Charbonnier and Vincent Trouche (who was managing director at Kenzan Media Labs and is now co-director at Artanim Interactive, a spin-off of the Artanim Foundation that will commercialise the idea), can be read online here.
