Meta Hyperscape lets users scan real environments and then explore photorealistic, navigable 3D reconstructions of them in VR

[As it announced its Quest 3S mixed reality headset, a Meta blog post mentioned a potentially exciting new tool:

“Anything your computer can do, our MR headset can do better—thanks to the magic of presence. And speaking of presence, we’re improving that, too. We’re working to bring photorealistic spaces into the metaverse, enabling a profound new way to be together in spaces that look and feel like you’re physically there—we call it Hyperscape.”

The story below from Cointelegraph provides key details and links to a hands-on ZDNet report on the new tool; an entry in The Ghost Howls blog provides another hands-on report and many more details (an excerpt regarding some of the potential applications of Hyperscape follows below). A 19:19 video from YourLifeVR is available on YouTube. The demo version of the new app is available from Meta. –Matthew]

[Image: Source: “Reality in VR. Gaussian Splatting! – Meta Horizon Hyperscape” video via YouTube]

Meta shows off Web3-to-reality bridge with ‘Hyperscape’ metaverse demo

The tech is still experimental, but its implications could change how users view reality.

By Tristan Greene
October 3, 2024

Meta recently showed off a new “Hyperscape” tech that takes the familiar idea of stitching photographs together into an explorable scene (as in YouTube’s 360 videos) and turns it into a real-time 3D rendering system that could potentially revolutionize telepresence and redefine the idea of working from home.

Hyperscape

Meta’s still as bullish as ever when it comes to the metaverse. As Cointelegraph recently reported, Meta CEO Mark Zuckerberg showed off the company’s new “Orion” smart glasses at the company’s “Connect” event on Sept. 25.

The Orion glasses purportedly give the user an effective heads-up display, allowing them to navigate the physical world with digital information seamlessly integrated into what they are seeing.

However, while the company’s new Orion spatial computing glasses may have gotten the most attention, its Hyperscape demo might be the most exciting update for those interested in both virtual reality and Web3.

Hyperscape, which is still experimental, would ultimately allow a person or machine to scan an area using a phone camera and then convert that imagery into a real-time-rendered, fully navigable digital environment.
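Meta has not published Hyperscape’s internals, but the pipeline is widely associated with 3D Gaussian Splatting, the technique named in the video caption above. What follows is a rough, purely illustrative Python sketch of the core rendering idea: depth-sorting Gaussian splats and alpha-compositing them front to back. Every function and parameter name here is hypothetical, and real splatting renderers are GPU-based and far more elaborate.

    import numpy as np

    def render_splats(means, colors, opacities, radii, cam_pos, width, height, focal):
        """Render isotropic Gaussian splats seen from a pinhole camera at
        cam_pos looking down the -z axis. Returns an (H, W, 3) float image."""
        image = np.zeros((height, width, 3))
        alpha = np.zeros((height, width, 1))           # accumulated opacity per pixel

        # Sort splats front to back by distance to the camera.
        order = np.argsort(np.linalg.norm(means - cam_pos, axis=1))
        for i in order:
            p = means[i] - cam_pos
            if p[2] >= 0:                              # behind the camera
                continue
            # Pinhole projection to pixel coordinates.
            x = int(width / 2 + focal * p[0] / -p[2])
            y = int(height / 2 + focal * p[1] / -p[2])
            r = max(1, int(radii[i] * focal / -p[2]))  # projected splat radius
            for v in range(max(0, y - r), min(height, y + r + 1)):
                for u in range(max(0, x - r), min(width, x + r + 1)):
                    # Gaussian falloff from the splat centre.
                    w = np.exp(-((u - x) ** 2 + (v - y) ** 2) / (2 * (r / 2) ** 2))
                    a = opacities[i] * w
                    # Front-to-back "over" compositing.
                    image[v, u] += (1 - alpha[v, u]) * a * colors[i]
                    alpha[v, u] += (1 - alpha[v, u]) * a
        return image

    # Tiny demo: fifty random splats floating in front of the camera.
    rng = np.random.default_rng(0)
    n = 50
    img = render_splats(
        means=rng.uniform([-1, -1, -4], [1, 1, -2], size=(n, 3)),
        colors=rng.uniform(0, 1, size=(n, 3)),
        opacities=rng.uniform(0.3, 0.9, size=n),
        radii=np.full(n, 0.05),
        cam_pos=np.zeros(3),
        width=160, height=120, focal=120,
    )

The front-to-back “over” compositing is what lets millions of semi-transparent splats resolve into a solid-looking scene; production systems replace the per-pixel Python loops with tiled GPU rasterization.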

One pundit who tried a demo of Hyperscape with Meta’s Quest 3 virtual reality headset described the experience as being like the “Holodeck” from the fictional Star Trek universe.

The demo is currently available to the general public, but it only allows users to visit a few different spaces that were pre-rendered using the technology.

Real-time telepresence

Future iterations of Hyperscape, however, could allow any observable environment to be rendered in the metaverse in real time. This could let people attending a meeting in virtual reality see and interact with the physically present attendees in real time, from a fully immersive perspective.

A decentralized version could allow geographically separated people to use similar technology to verify reality in real time from a navigable perspective via the metaverse. This might prove far more immersive and socially binding than relying on pre-recorded or forced-perspective video footage to verify facts.

The advent of non-fungible tokens and the rising popularity of digital assets have made the metaverse possible, but arguably, its mainstream proliferation will require a bridge between Web3 and reality that offers incentives beyond the purely financial.

[From The Ghost Howls]

Meta Hyperscape hands-on: A wonderful glimpse of the future

By Skarredghost
October 9, 2024

[snip]

I got pretty excited by Meta Hyperscape, so as an entrepreneur and developer I immediately started thinking about all the possibilities enabled by this technology. The first thing I thought is that this will disrupt virtual tours: if I were Matterport or another company making money with virtual tours, I would be concerned, because with Hyperscape, everyone with a phone (and some technical knowledge) could scan their own space and create a virtual tour of it. And this virtual tour would be fully 6DoF: people can move freely inside it, perceive objects at their proper depth, and so on. It would not be just a set of 360 photo bubbles, but full navigation of another space. I know that Hyperscape is in beta, but when it is finalized, this service will be perfect for visiting homes remotely.
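To make the 6DoF point concrete: a 360 photo bubble can only honour rotation of the viewer’s head, while a reconstruction like Hyperscape’s can also honour translation, so stepping sideways produces real parallax. A minimal, purely illustrative sketch of that distinction (all names here are hypothetical, not from any Meta API):

    from dataclasses import dataclass

    @dataclass
    class Pose3DoF:
        """Orientation only: all a 360 photo bubble can honour."""
        yaw: float = 0.0
        pitch: float = 0.0
        roll: float = 0.0

    @dataclass
    class Pose6DoF(Pose3DoF):
        """Orientation plus position: what a full 3D reconstruction can
        honour, so moving your head actually changes the viewpoint."""
        x: float = 0.0
        y: float = 0.0
        z: float = 0.0

    viewer = Pose6DoF()
    viewer.x += 0.5   # step half a metre sideways: only a 6DoF scene can render this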

The social aspect is also another thing that has always fascinated me about reconstructed 3D spaces. I could scan my new home and invite my friends to visit it in VR. Or I could scan a space at a specific moment I want to remember (e.g., a portion of the restaurant at my wedding party) and revisit it alone or with my family whenever I want. I’m pretty sure that Meta is primarily interested in this, and I see great potential in it: just as today you take a picture to remember a moment, in the medium-to-long-term future you may be able to save a full environment.

The office of Mark Zuckerberg is a glimpse of that: I will probably never in my life have the opportunity to be invited by Marky Z to his office, but with Hyperscape, I was there. It could also become a service offered by celebrities to let you visit their environments, their favorite spaces, or the locations of their events…

Another idea could be creating VR games that are set in real spaces. Indie developers with limited budgets for 3D graphics could, for instance, scan their rooms (after removing the pizzas and the Red Bulls :P) and set up an escape room or a point-and-click adventure there.

As Varjo’s PR told me at AWE when discussing their Teleport service, this can also be useful in the B2B sector: for instance, if a company has to set up a stage, a booth, or something like that, it could build the setup, scan the environment, and send the scan to the CEO, who could step in and verify that he likes it.

I think this is an enabling technology that, as soon as it is stable, will offer many new opportunities in the XR space. And don’t get me started on what could be possible with interactions: if some of the scanned objects could be grabbed or interacted with (e.g., you could turn on a light), that would be amazing.

[snip to end]

