Making Meta: Inside the tech giant’s Pittsburgh offices where virtual reality shapes its future

[This story from Pittsburgh Inno describes some of the technology Meta is using to develop more effective social presence illusions. See the original version for 12 more images, and for more Meta developments see a related story from NEXTpittsburgh, “Inside Pittsburgh Tech: Walking a Mile in the Ekto One Virtual Reality Boots” (which includes a 1:53 minute video). –Matthew]

[Image: Meta’s Reality Labs in Pittsburgh. Shown here is Mugsy, a system with 170 cameras and hundreds of lights in a sphere-like shape that’s nearly 10 feet tall. It’s used to recreate someone’s face with impressive detail. Credit: Nate Doughty]

Making Meta: Inside the tech giant’s Pittsburgh offices where virtual reality shapes its future

By Nate Doughty
August 24, 2022

Chuck Hoover is convinced that the future of Meta Platforms Inc. will continue to focus heavily on making the metaverse — a single and universally experienced virtual world connected to the internet that is viewed through virtual reality and augmented reality headsets.

As the general manager of Meta’s (NASDAQ: META) Pittsburgh offices, Hoover is tasked with making this virtual world a physical reality for Facebook’s parent company, which has reportedly invested tens of billions of dollars into metaverse-related developments over the past few years.

Those developments, as they relate to the creation of true-to-life avatars that will someday make up the population of this online world, are being spearheaded by about 300 Meta employees in Pittsburgh, where an array of custom-built video and audio capture systems has been designed as part of Meta’s Great Human Survey. The survey, an initiative started back in 2017, aims to compile digital transformations of 10,000 people and has already transformed more than 200 people in and around the Pittsburgh area for placement into Meta’s metaverse.

The company offered updates on this progress and other metaverse-related feats to members of the local media on Tuesday, its first public showing of the offices, located near the border of the Strip District and downtown, since the pandemic began.

“For a long time this research was really secret, it was really under wraps in the early days,” Hoover said. “I think more and more as we’ve advanced this, we really want to get the story out here of what this tech is, what we’re doing, and how this connects to the metaverse.”

A series of recent research breakthroughs at the company’s Pittsburgh office has led to the debut of capture systems capable of picking up facial details as fine as an individual hair under a myriad of lighting conditions. Another system records immersive audio experiences meant to be played back in a way that is indistinguishable from what was heard in reality, preserving the spatial position of the original sound sources relative to the recording device.

All of these efforts have allowed Meta to show off avatars of humans that, with just a bit of forgiveness, look and sound true-to-life when seen in a VR headset.

While wearing such a headset, users can watch as the eyes of these digital humans follow their own in a way that’s not unlike real eye-to-eye, face-to-face contact. Sound emitted from these avatars is fixed to them, and as a user moves around an avatar, the sound moves around the user’s ears too, just as it would if someone changed their position relative to a person who is speaking.
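
To make that avatar-fixed audio behavior concrete, here is a minimal sketch, in Python, of how a playback engine might recompute a sound source’s direction and level relative to a moving listener. This is not Meta’s pipeline; the function name, the flat 2D geometry, and the simple inverse-distance gain are illustrative assumptions (real spatial audio engines use head-related transfer functions and room acoustics rather than a bare angle and gain).

```python
import math

def relative_audio_params(listener_pos, listener_yaw, source_pos):
    """Illustrative only: where an avatar's voice should appear to come
    from, relative to a moving listener.

    listener_pos, source_pos: (x, y) positions in meters
    listener_yaw: direction the listener is facing, in radians
    Returns (azimuth_deg, gain): the source's angle relative to the
    listener's facing direction, and a crude inverse-distance gain.
    """
    dx = source_pos[0] - listener_pos[0]
    dy = source_pos[1] - listener_pos[1]
    distance = max(math.hypot(dx, dy), 0.1)    # avoid divide-by-zero up close
    world_angle = math.atan2(dy, dx)           # source bearing in the world frame
    azimuth = math.degrees(world_angle - listener_yaw)
    azimuth = (azimuth + 180) % 360 - 180      # wrap into [-180, 180)
    gain = 1.0 / distance                      # simple 1/r loudness falloff
    return azimuth, gain

# As the listener walks past a stationary speaking avatar at the origin,
# the apparent direction of the voice sweeps around their head.
for x in (-2.0, 0.0, 2.0):
    print(relative_audio_params((x, -2.0), math.pi / 2, (0.0, 0.0)))
```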

“The fact that we now have early prototypes that actually work of being able to scan yourself and generate a lightweight avatar, that’s all stuff that’s come together in the last maybe nine months and has been really exciting,” Hoover said. “It’s cool to talk about the fact that we’re in Pittsburgh doing this stuff to kind of let everybody in the community know what we’re doing.”

Turning the physical into the virtual

To create its avatars, Meta has three capturing systems that aim to digitally transform different parts of the human body.

With Mugsy, Meta has assembled 170 cameras and hundreds of lights in a sphere-like shape that’s nearly 10 feet tall to recreate someone’s face in impressive detail. The capture process, done while a person sits in a chair, takes about 90 minutes, and the end result is an avatar whose facial features look and react much like its human counterpart’s. When capturing is done, Meta said, the system amasses about 16 terabytes of data per person; that’s equivalent to about 8,000 hours of film.
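
As a rough sanity check on that comparison, dividing 16 terabytes by 8,000 hours implies a data rate in the range of ordinary compressed streaming video. Only the two quoted figures come from Meta; the arithmetic below is an illustration, not additional reporting.

```python
# Back-of-the-envelope arithmetic behind "16 TB ≈ 8,000 hours of film".
# Only the 16 TB and 8,000-hour figures come from the article.
capture_bytes = 16e12                 # ~16 terabytes per captured person
film_hours = 8_000                    # stated equivalent running time

bytes_per_hour = capture_bytes / film_hours        # 2e9 bytes -> ~2 GB per hour
bits_per_second = bytes_per_hour * 8 / 3600        # ~4.4 Mbit/s

print(f"{bytes_per_hour / 1e9:.1f} GB per hour of 'film'")
print(f"{bits_per_second / 1e6:.1f} Mbit/s, comparable to compressed streaming video")
```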

“Mugsy is probably the most densely imaged volume in the world,” Yaser Sheikh, a director of Meta’s Reality Labs, said. “It captures the nuances of motion and our task is to overcome the most sophisticated perceptual system in the world, which is humans looking at other humans, especially people who they don’t know.”

There’s also Hearsay, which exists in a large room with microphones embedded into 108 ear pairings placed at different heights throughout the space, part of an effort to capture sound as it’s heard by humans. Antiechoicon, another sound-based capture system, is a sphere-shaped, sound-proofed anechoic chamber filled with hundreds of microphones used for other types of audio capture, like singing or the playing of instruments.
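
The article doesn’t describe how these recordings are processed, but the general technique such rooms support is well established: a “dry” recording made in an anechoic chamber can be convolved with an impulse response measured at a listener’s ears to simulate how the same sound would be heard at that position in a real space. Below is a minimal sketch using synthetic placeholder signals; none of the numbers reflect Meta’s actual measurements.

```python
import numpy as np

# Illustrative only: combine a dry (anechoic) recording with a measured
# ear impulse response to simulate how the sound would be heard in a room.
# Both signals here are synthetic stand-ins, not real measurements.
sample_rate = 48_000
t = np.arange(sample_rate) / sample_rate

dry_signal = np.sin(2 * np.pi * 440 * t)      # stand-in for an anechoic recording
impulse_response = np.zeros(2400)
impulse_response[0] = 1.0                      # direct sound
impulse_response[960] = 0.4                    # one early reflection, ~20 ms later

# Convolution "places" the dry sound at the position where the impulse
# response was measured; doing this once per ear yields a binaural render.
heard_at_ear = np.convolve(dry_signal, impulse_response)
print(heard_at_ear.shape)  # (50399,) = dry length + IR length - 1
```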

“What we call the metaverse, it is sort of one of the grand challenges of the next ten years, and I would say the social metaverse is the core part of it,” Sheikh said. “The mission of this lab is to enable remote interactions that are indistinguishable from in-person interactions. In other words, all of those social signals that we exchanged when we talk to one another, we want to be able to capture those, reproduce them, and allow people to connect and have relationships no matter where they are in the world.”
