Lytro Immerge: Groundbreaking camera will let you move around in VR video

[This new camera and production system offers the promise of what Lytro calls “true live-action presence in VR” (as reported in Mashable); the story below is from Wired, where it includes more images. A 3:43 minute video introducing Lytro’s Immerge is available on Vimeo; and see a story about Uncorporeal’s related technology in ISPR Presence News. –Matthew]

[Image: Lytro Immerge camera in woods]

Tim Moynihan
11.05.15

Ever since Lytro burst onto the scene with its first light-field camera three years ago, it’s done things very differently. The company’s imaging technology has always been groundbreaking: it makes still cameras that let you capture a shot, then decide where to focus and how to adjust the depth of field after the fact. All of a sudden, “fix it in post” applied to things that were never before possible in still photography—with a single camera, at least.

But Lytro’s first two cameras weren’t smash hits, for good reason. The company’s first consumer product, a little kaleidoscope-looking shooter with basic controls and a tiny viewfinder screen, was extremely limited for its $400-to-$500 price. Its second camera, the Lytro Illum, looked more like a DSLR and offered deeper controls. It also cost $1,600 and stumbled in key areas of usability. Neither camera captured high-resolution images, and image quality was middling despite those innovative refocusing tricks.

They didn’t shoot video, either, but Lytro always teased that similar focus-after-the-shot features were possible with moving images too—a Lytro camera that shot video was inevitable, really. Now it’s here, and it’s an entirely different beast from what most expected. Lytro has set its sights on capturing VR video, and the company is announcing a new end-to-end system that could radically change the possibilities for VR viewers and filmmakers alike.

The Lytro Immerge system has been in development for about a year and a half. It’s not for average consumers; it’s the first Lytro product intended for professional video production. And it’s not just a VR camera rig made up of video-shootin’ Lytros, either: Lytro CEO Jason Rosenthal says the sensors and the system were designed completely from scratch. Its centerpiece is the Immerge camera, a five-ring globe that captures what Lytro is calling a “light-field volume.” Unlike existing VR camera rigs, which capture a static 360-degree image, the Immerge camera will let you move around a bit within the scene.

“Imagine a camera staying stationary, but being able to move your head around and getting further and closer away from an object in a scene,” Rosenthal explains. “Having the reflection and the light rays adjusting accordingly. What the light field volume represents is, we’re densely capturing all the rays in a given geometric volume, and then we’ve built software that lets us play back those rays at very high frame rates and at high resolution. It gives you the perfect recreation of the actual world you’re capturing.”
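To make that concrete, here is a minimal sketch of the idea in Python. Everything in it (the Ray structure, the distance metric, the dictionary-based lookup) is an illustrative assumption, not Lytro’s actual format: a light-field volume stores rays, each an origin inside the volume plus a direction, and playback looks up the stored radiance for whatever rays the viewer’s tracked head pose requires.

```python
import math
from dataclasses import dataclass

@dataclass(frozen=True)
class Ray:
    x: float; y: float; z: float   # origin inside the capture volume (meters)
    theta: float; phi: float       # viewing direction (radians)

def ray_distance(a: Ray, b: Ray) -> float:
    # Crude metric mixing positional and angular distance; illustrative only.
    return math.dist((a.x, a.y, a.z), (b.x, b.y, b.z)) \
        + abs(a.theta - b.theta) + abs(a.phi - b.phi)

def sample_light_field(captured: dict, query: Ray) -> tuple:
    """Return the RGB radiance of the captured ray nearest to `query`.
    A real renderer would interpolate among many nearby rays."""
    nearest = min(captured, key=lambda r: ray_distance(r, query))
    return captured[nearest]

# Playback: for each headset pixel, build the ray the eye would see through
# that pixel at the current tracked head pose, then look it up.
captured = {Ray(0.0, 0.0, 0.0, 0.0, 0.0): (200, 180, 160)}
print(sample_light_field(captured, Ray(0.1, 0.0, 0.0, 0.05, 0.0)))
```

A real renderer would interpolate among many nearby captured rays rather than pick a single nearest neighbor, which is exactly why the capture has to be so dense.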

That capability alone is a mind-boggling leap for VR filmmaking, and the Immerge announcement is well timed for the next generation of VR hardware. Positional-tracking headsets such as the Oculus Rift, HTC Vive, and Sony PlayStation VR are all slated for release next year. Rosenthal says Lytro’s video will be compatible with all of those platforms.

Until now, those upcoming positional-tracking headsets have suggested a big new leap forward for immersive gaming, but not necessarily for video: in a completely computer-generated environment, players can actually move around. But when its footage is viewed with a positional-tracking headset, Immerge promises to bring similar freedom of movement to live-action video—albeit a very limited range of movement in this first-generation product.

“Each one of the layers [of the camera] represents a very densely packed array of light-field sensors,” Rosenthal explains, saying that in a single rig, there will be hundreds of individual sensors capturing the light-field volume. “It’s a 360-degree light-field capture. And then by stacking five on top of one another, we’re giving you a capture of the full light-field volume. You’ll get roughly a cubic meter of area that you’ll be able to fully capture the set of rays within that. And then, from a consumer experience, you’ll be able to move around within that volume and have the world react accordingly.

“We can and will build bigger spheres and other configurations for different types of shooting situations. You can go outside the ring, but what you’ll see is kind of a gradual degrading of the experience. You’ll start to notice gaps where we might not have the image data to fully recreate the scene … These are still things that we’re refining, but when you use an iPhone and you get to the end of the scroll, you get that kind of rubber-banding experience. You could imagine something similar in a VR experience, where you get to the edge of what a camera can do, your environment stretches a little bit.”
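Rosenthal’s rubber-banding analogy maps onto a simple soft clamp on the tracked head position. Here is a hypothetical one-axis sketch, assuming a capture volume roughly a meter across; the limit and stretch values are made up for illustration:

```python
import math

def rubber_band(head_pos: float, limit: float = 0.5, stretch: float = 0.1) -> float:
    """Soft-clamp one axis of tracked head position to the capture volume.

    Inside +/-limit the position passes through unchanged; past the edge,
    extra movement is compressed asymptotically, like iOS scroll
    rubber-banding. All values are hypothetical, not Lytro's.
    """
    if abs(head_pos) <= limit:
        return head_pos
    overshoot = abs(head_pos) - limit
    softened = limit + stretch * math.tanh(overshoot / stretch)
    return math.copysign(softened, head_pos)

# Stepping 0.3 m past the edge of a ~1 m-wide volume shows up as ~0.1 m:
print(rubber_band(0.8))   # ~0.5995
```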

Lytro intends Immerge to be an end-to-end workflow system. Forget about popping a high-capacity SD card into this thing: to store the ridiculous amounts of high-speed, high-resolution data shuttling from those hundreds of sensors as it’s captured, the Immerge system comes with its own stack of servers. Rosenthal says each server holds about an hour of 360-degree light-field video.
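Some back-of-envelope arithmetic suggests why. Every figure below is an assumption for illustration, since Lytro hasn’t published sensor counts, resolutions, bit depths, or frame rates; the point is only that “hundreds of sensors” adds up fast:

```python
# Every figure below is an assumption for illustration; Lytro hasn't
# published sensor counts, resolutions, bit depths, or frame rates.
sensors      = 200     # "hundreds of individual sensors"
pixels       = 2e6     # assumed 2-megapixel sensors
bytes_per_px = 1.5     # assumed ~12-bit raw
fps          = 30      # assumed frame rate

rate = sensors * pixels * bytes_per_px * fps    # bytes per second
print(f"{rate / 1e9:.0f} GB/s")                 # ~18 GB/s
print(f"{rate * 3600 / 1e12:.0f} TB per hour")  # ~65 TB per hour
```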

Because it’s for professionals, the system is designed to work with production tools like Nuke, Final Cut Pro, Adobe Premiere, and Avid Media Composer. Plug-ins for those programs and other industry-standard software will support importing and working with Immerge video.

“Right now, content creators and filmmakers are creating their own tools and technology, because none of the production workflow from the hardware to the software actually exists,” Rosenthal says. “And that just sucks up a huge amount of time and cost and energy that they’d rather be investing in telling great stories.”

There’s another huge difference in directing VR video: VR camera rigs capture video in all directions, so the crew can’t stand on the set without being part of the show. Because of that, the Immerge system is designed to be operated remotely via a tablet or phone, so that camera operators can make adjustments without stepping into the scene. And to minimize the learning curve, the camera-control software is modeled after the control schemes of professional-level Arri, RED, and Sony cameras.

“It’s very much this system that will capture, edit, and process once, but it’ll render and play back in any device,” Rosenthal says. “This is something that we’ve heard a lot of excitement about from our partners. Today, they have to go through a very different production workflow depending on what device they’re building for.”

Those partners include some heavy hitters in the world of VR. Lytro says the system was developed with input from seasoned VR production companies Vrse, Wevr, and Felix & Paul. Rosenthal says that in the early months of 2016, you’ll be able to watch content shot by those production companies with the Immerge system.

Don’t hold your breath when it comes to shooting light-field VR video with your phone or a consumer camera. The Immerge system is cost-prohibitive and bulky; Rosenthal says it’s designed to be mounted on a tripod or a dolly rather than operated as a handheld unit. And it’s too heavy for a drone. Buying one will cost several hundred thousand dollars when it becomes available in the first half of 2016. In the world of filmmaking, though, most professional equipment is rented, and Rosenthal says rental plans for a few thousand dollars a day are in the works.

All this is just stage one. Rosenthal says even more amazing applications are planned for Immerge’s future: refocusing within a scene based on a headset’s eye tracking, instantaneous 3D modeling, and realistic blending of CG content with live video.

“We’re capturing all the depth and the 3D geometry of the real world, so compositing computer-generated objects into that with the right depth and the right shading and shadows and lighting, that all becomes much easier than it’s ever been before,” Rosenthal explains. “Imagine a video game where instead of having the weird motion-captured, CG-rendered people, you can actually have photo-realistic people. That starts to become possible.”
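The compositing advantage Rosenthal describes comes down to having a real depth value for every live-action pixel. Below is a minimal sketch of that per-pixel depth merge; it is the textbook z-buffer idea, not Lytro’s actual pipeline, and the array shapes and values are hypothetical:

```python
import numpy as np

def composite(live_rgb, live_depth, cg_rgb, cg_depth):
    """Per-pixel depth merge: whichever layer is nearer the camera wins.

    With a light-field capture, `live_depth` comes from the recovered
    scene geometry instead of manual rotoscoping. Shapes: (H, W, 3) for
    color, (H, W) for depth. Hypothetical sketch, not Lytro's pipeline.
    """
    cg_in_front = (cg_depth < live_depth)[..., np.newaxis]
    return np.where(cg_in_front, cg_rgb, live_rgb)

# A CG object at 2 m correctly occludes live pixels at 5 m:
live = np.full((4, 4, 3), 100, np.uint8); live_d = np.full((4, 4), 5.0)
cg   = np.full((4, 4, 3), 255, np.uint8); cg_d   = np.full((4, 4), 2.0)
print(composite(live, live_d, cg, cg_d)[0, 0])   # -> [255 255 255]
```

With per-pixel depth in hand, matching occlusion, shading, and shadows becomes a geometry problem rather than a manual effects job, which is the ease Rosenthal is pointing at.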

