Realities.io’s jaw-droppingly detailed photogrammetric VR environments

[Here’s another clever technique for creating immersive, literally photo-realistic, presence-evoking virtual environments; the story from Road to VR includes more images, and a 3:06 video titled “This is Not a Live Action Video, This is a VR Scene from Realities.io” is available on YouTube. For more coverage, see the Just A/VR Show. –Matthew]


Inside ‘Realities’ Jaw-droppingly Detailed Photogrammetric VR Environments

By Ben Lang – Mar 11, 2016

Realities wants to transport viewers to exciting places across the globe in ways that are much more immersive than the single, flat vantage point of a 360 degree photo or video. The company is creating an efficient photogrammetry pipeline to build sometimes frighteningly realistic virtual environments that users can actually walk through.

I recently visited the Realities team to check out their very latest work. After donning a Vive Pre headset and playing around for a solid 10 or 15 minutes, I was thoroughly impressed… and I hadn’t even left the menu.

No One Said Menus in VR Have to Be Boring

Realities wants to capture exciting places around the world and allow anyone to visit. Encapsulating this sense of global teleportation, the developers have created a menu which takes the form of a detailed model of the Earth floating amidst a sea of stars and our nearby Sun. I could have clicked on one of the nodes on the globe to teleport to the corresponding scene, but I was far too fascinated with the planet itself.

Starting at about the size of a beach ball, the Earth in front of me looked like one of those amazing NASA photos showing the entire sphere of the planet within the frame. There’s a good reason it looked that way: it was constructed using assets from NASA, including real cloud imagery and accurate elevation data for the Earth’s continents (exaggerated so that users can make out the detailed terrain from their satellite vantage point). The dark side of the globe also accurately lights up with a brilliant, warm, grid-like glow of city lights scattered across the planet’s surface.
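The article doesn’t say how that terrain exaggeration is applied; a common approach is to displace each globe vertex outward along its sphere normal by a scaled-up elevation value. Here is a minimal sketch in plain C++ (the exaggeration factor and in-scene globe radius are illustrative assumptions, not Realities’ actual values):

#include <cstdio>

// Hypothetical vertex-displacement sketch: push each globe vertex outward
// along its normal by an exaggerated elevation so terrain reads at a
// satellite-scale view. The 50x factor and 0.35 m radius are assumptions.
struct Vec3 { double x, y, z; };

Vec3 DisplaceVertex(const Vec3& unitDir, double elevationMeters)
{
    const double kEarthRadiusM = 6371000.0;  // mean Earth radius
    const double kExaggeration = 50.0;       // assumed, not from the article
    const double kGlobeRadius  = 0.35;       // in-scene radius (~beach ball), metres

    // Scale real elevation into globe units, then exaggerate it.
    double offset = (elevationMeters / kEarthRadiusM) * kGlobeRadius * kExaggeration;
    double r = kGlobeRadius + offset;
    return { unitDir.x * r, unitDir.y * r, unitDir.z * r };
}

int main()
{
    // Everest (~8,849 m) vs. sea level along the same direction.
    Vec3 dir{ 0.0, 0.0, 1.0 };
    Vec3 peak = DisplaceVertex(dir, 8849.0);
    Vec3 sea  = DisplaceVertex(dir, 0.0);
    std::printf("exaggerated bump: %.4f m in scene\n", peak.z - sea.z);
}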

With the Vive’s controllers I was able to reach out and manipulate the globe by spinning it, moving it, and doing a pinch-zoom gesture to make it bigger or smaller. I grabbed the Earth and lifted it over my head to look at it from below; for the first time in my life, I truly got a sense of the way in which South America really curves down under the Earth as it reaches toward Antarctica. I’d never seen—or I suppose ‘felt’—the shape of the continent in that way before, not even when playing with one of those elementary school globes in the real world.
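The article doesn’t describe how this grab-and-scale interaction is implemented; one straightforward way, sketched below, is to record the distance between the two controllers when the grab starts and scale the globe by the ratio of the current separation to the starting separation, while keeping the globe anchored to the midpoint of the hands. The names and structure are assumptions for illustration, written in plain C++ rather than Realities’ actual UE4 code.

#include <cmath>

// Sketch of a two-controller "pinch-zoom" grab, assuming we can read each
// controller's world position every frame.
struct Vec3 {
    double x, y, z;
    Vec3 operator-(const Vec3& o) const { return { x - o.x, y - o.y, z - o.z }; }
    Vec3 operator+(const Vec3& o) const { return { x + o.x, y + o.y, z + o.z }; }
    Vec3 operator*(double s) const { return { x * s, y * s, z * s }; }
    double Length() const { return std::sqrt(x * x + y * y + z * z); }
};

struct PinchGrab {
    double startSeparation = 1.0;  // distance between controllers at grab start
    double startScale      = 1.0;  // globe scale at grab start
    Vec3   grabOffset{};           // globe centre relative to hand midpoint at grab start

    void Begin(const Vec3& left, const Vec3& right, const Vec3& globeCentre, double globeScale)
    {
        startSeparation = (right - left).Length();
        startScale      = globeScale;
        grabOffset      = globeCentre - (left + right) * 0.5;
    }

    // Called every frame while both grips are held.
    void Update(const Vec3& left, const Vec3& right, Vec3& globeCentre, double& globeScale) const
    {
        double separation = (right - left).Length();
        globeScale  = startScale * (separation / startSeparation);  // spread hands -> bigger globe
        globeCentre = (left + right) * 0.5 + grabOffset;            // globe follows the hands
    }
};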

I went looking for Hawaii, where I once lived for a year. After spotting the lonesome island chain in the middle of the Pacific I wanted to zoom right down to street-level to see my stomping grounds. When I couldn’t zoom in much further than the ‘state view’, I had to remind myself that this was, after all, just the menu. At this point I realized I’d been playing around with it for 10 or 15 minutes and hadn’t even stepped inside Realities’ actual product.

So I swung the globe around to find a highlighted node in Europe (where the Realities devs are based and where many of their early scenes were captured) and clicked to teleport.

Stepping Inside a Virtually Real Virtual Environment

When the scene faded in, I found myself in a small, dilapidated auditorium. Now, this would have been a ‘neat’ place to see in the form of a high resolution 360 photo. But with Realities, the scene around me not only had depth, it was actually made of real 3D geometry and was fully navigable, either by physically walking around within the Vive’s room-scale space or by pointing with the controller and clicking to blink from one spot to the next. There’s a massive gulf in immersion between just seeing a space like this in a 360 photo and being inside a detailed recreation where I can walk around and kneel down to see the dust on the floor.

For the most part, the scene around me was more detailed and realistic looking than any other virtual world I’ve been inside. This is thanks to the photogrammetric technique that Realities is using to capture these spaces, which involves the use of high resolution photography to recreate a very accurate model of the real space.

The developers joked that they get to “outsource the lighting engine to the real world,” since the scene is lit as desired in real life before capture. With photogrammetry, the real world also handles the jobs of texture artist and 3D modeler (with a little help from someone like Realities, who has to find a way to pull it all into the computer).

The results are only as good as the pipeline, though, and Realities has made big improvements since the last time we saw their work. Inside the crumbling auditorium I could see the debris scattered across the room with detail that would be simply unfathomable to craft by hand in a 3D modeling program in the time it took Realities to capture the space.

Even the most minute details were captured and shown as fully realized geometry. As I approached the walls I could see the flecks of paint curling and flaking off after decades of neglect. The graffiti-covered walls revealed individual strokes of paint up close, and scattered about the debris I spotted the discarded yellow cap of a can of spray paint. The developers told me people often find tiny details—like a pack of cigarettes and a lighter—that they themselves haven’t spotted before. No joke… a scavenger hunt in a space like this would probably be a lot of fun.

In some areas where adequate data wasn’t captured (like in tight nooks where it’s hard to fit a camera), the geometry will look sort of like a janky 3D capture from the original Kinect, with stretched textures and unearthly geometry, especially on very fine edges. Some of this is hidden by the fact that the real space I was looking at is quite derelict, so it’s already a chaotic mess in the first place. Still, the Realities team has shown that wherever they can capture adequate data, they are capable of spitting out richly detailed geometry on both the macro and micro scales.

I should be a bit more clear about the locomotion method employed by Realities. It isn’t a ‘blink’ as much as a ‘zoom’. When you point and click, you traverse from A to B almost, but not quite, instantly. You still see the world move around you, but it’s very fast. So fast, the developers tell me, that it’s beyond normal human reaction time: by the time you can react to the motion, you have already reached your destination. That’s the hypothesis at least, and in my testing it seemed to offer the nausea-free benefits of blinking while keeping you better oriented between where you are and where you came from, instead of just teleporting instantly from A to B.
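As a rough illustration of that idea, the sketch below moves the viewpoint from A to B over a fixed duration well under typical human reaction time (roughly 200 ms), with a smoothstep ease so the motion starts and ends gently. The 100 ms duration, the easing curve, and the structure are assumptions, not Realities’ actual implementation.

#include <algorithm>
#include <cstdio>

// "Zoom" locomotion sketch: translate the viewpoint over a duration shorter
// than typical human reaction time, so the motion is visible but over before
// the viewer can react to it.
struct Vec3 { double x, y, z; };

Vec3 Lerp(const Vec3& a, const Vec3& b, double t)
{
    return { a.x + (b.x - a.x) * t, a.y + (b.y - a.y) * t, a.z + (b.z - a.z) * t };
}

struct ZoomMove {
    Vec3   from{}, to{};
    double duration = 0.1;   // seconds; assumed, well under reaction time
    double elapsed  = 0.0;

    bool Tick(double dt, Vec3& cameraPos)
    {
        elapsed = std::min(elapsed + dt, duration);
        double t = elapsed / duration;
        t = t * t * (3.0 - 2.0 * t);           // smoothstep: ease in and out
        cameraPos = Lerp(from, to, t);
        return elapsed >= duration;            // done?
    }
};

int main()
{
    ZoomMove move{ {0, 1.7, 0}, {4, 1.7, 3} };   // eye height stays constant
    Vec3 cam = move.from;
    for (int frame = 0; frame < 12; ++frame) {   // ~90 Hz headset frames
        bool done = move.Tick(1.0 / 90.0, cam);
        std::printf("t=%5.3fs cam=(%.2f, %.2f, %.2f)\n", move.elapsed, cam.x, cam.y, cam.z);
        if (done) break;
    }
}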

Making the Virtual Real World More Real

Since the lighting of the scene is ‘baked’ into the photo-based textures, the raw capture won’t include volumetric light or specular lighting. However, the team has shown how they can layer in some of these effects to make the scenes even more realistic.

In another space that I saw—an abandoned room with an old grand piano at the center—the team had added volumetric light rays streaming through the high windows along the side of the room. Following the beams from the window to their landing point on the opposite wall, they reconnected with the baked lighting from the real scene in a convincing way. With the rays illuminating the (added) dust particles floating about, the decrepit space was yet more convincing.

Another scene I got to see was an abandoned shower and bathing space from an old hospital. Ceramic tiles covered the space from floor to vaulted ceiling. Instead of the walls growing flatter and less real as I approached (as you might expect in a videogame), getting closer actually revealed them to be even more real than they looked from afar, thanks to the team’s photogrammetry process which was accurate enough to capture the mere millimeters of elevation change between the tiles themselves and the sunken grout lines between them.

The realism of the tiles is further enhanced because the Realities team added in-engine specular lighting, showing reflective highlights on the semi-reflective tiles that move appropriately as you move your head through the scene, overcoming the static look of purely baked lighting. This is especially impressive up close and at extreme angles, where the usual techniques used in many non-VR experiences (like bump mapping) would break down.
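One way to get that effect, sketched below, is to keep the photo-derived colour as a baked diffuse base and add a view-dependent Blinn-Phong specular term on top, so the highlight shifts with the viewer’s head position. The light direction, shininess, and specular strength here are illustrative assumptions; Realities’ actual shader isn’t described in the article.

#include <algorithm>
#include <cmath>
#include <cstdio>

// Layer a view-dependent specular highlight on top of a baked, photo-derived
// diffuse colour so the tiles glint as the head moves.
struct Vec3 {
    double x, y, z;
    Vec3 operator+(const Vec3& o) const { return { x + o.x, y + o.y, z + o.z }; }
    Vec3 operator*(double s) const { return { x * s, y * s, z * s }; }
};

Vec3 Normalize(const Vec3& v)
{
    double len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

double Dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// bakedDiffuse: colour sampled from the photogrammetry texture (lighting already baked in).
// normal, viewDir, lightDir: unit vectors in world space.
Vec3 ShadeTile(const Vec3& bakedDiffuse, const Vec3& normal, const Vec3& viewDir, const Vec3& lightDir)
{
    const double shininess = 64.0;   // tight highlight for glossy ceramic (assumed)
    const double strength  = 0.25;   // how much specular to add on top (assumed)

    Vec3 halfVec = Normalize(viewDir + lightDir);               // Blinn-Phong half vector
    double spec  = std::pow(std::max(0.0, Dot(normal, halfVec)), shininess);
    return bakedDiffuse + Vec3{ 1.0, 1.0, 1.0 } * (spec * strength);  // additive white highlight
}

int main()
{
    Vec3 tileColour{ 0.55, 0.55, 0.50 };                        // sampled from the baked texture
    Vec3 n = Normalize({ 0.0, 0.0, 1.0 });
    Vec3 colour = ShadeTile(tileColour, n,
                            Normalize({ 0.2, 0.1, 1.0 }),       // toward the viewer's head
                            Normalize({ -0.3, 0.4, 1.0 }));     // assumed window direction
    std::printf("shaded tile: (%.2f, %.2f, %.2f)\n", colour.x, colour.y, colour.z);
}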

Go to Places You Can’t Go

So Realities can capture impressive looking real-world spaces and bring them into VR. Now what?

The company’s current approach is to capture amazing places around the world and allow people to visit as virtual tourists, with a bit of an educational bent. Some of the early user interactions within these spaces involve finding photos positioned throughout the area which, when retrieved, reveal interesting information about the space. The team especially wants to capture places people aren’t normally allowed to go. For instance, all of the scenes I described in the abandoned building were of a place the public isn’t actually allowed to visit.

It’s a compelling idea, especially given the rate at which the team’s photogrammetry technique is improving. I imagine it would be awesome to step inside the White House’s Oval Office, but even if I were allowed to do so, it might not be compelling enough for me personally to warrant a trip to Washington D.C. But it’s almost certainly compelling enough to warrant merely putting on a VR headset if it means I can see a near-identical recreation. And there’s no doubt that the recreated scenes would be accurate enough to almost entirely replicate the experience of walking around an art museum.

As a gamer at heart, however, I was definitely thinking of ways these spaces might be used for more action-oriented interactive experiences. With the team already rendering these scenes in real time in UE4—even with VR’s high-end processing demands—it seems like that should be possible. Anyone up for team deathmatch in the Oval Office?

Realities is planning to launch a version of their photogrammetry captures on SteamVR for free. Timing on that release is not yet determined, but the company will be showcasing their work at GDC this coming week at the Graphine booth (Main Hall, booth 429).
