Meta’s new finger tracking tech enables improved use of sign language in VR

[The continuing evolution of presence-evoking immersive technologies is enabling valuable use cases that enhance communication and users’ lives, as reported in this story from Futurism. See the original version for a different image, and for more details follow the link to the 14:28 video highlighted in the story. For more about Jenny and the Helping Hands community, see March 2022 coverage from Input. –Matthew]

[Image: Source: Input]  

Sign Language in Virtual Reality Actually Looks Kinda Awesome

Meta’s new finger tracking tech is actually pretty impressive.

By Victor Tangermann
November 1, 2022

Virtual reality isn’t all about legless avatars and forgettable games.

Case in point: members of the deaf community are now taking advantage of modern headsets’ increasingly sophisticated hand tracking to use sign language in virtual reality.

It’s a compelling use case for the technology, allowing those who grew up with sign language as their first language, those who developed hearing problems later in life, and anyone else who communicates by signing to converse effortlessly inside virtual worlds.

Meta recently added hand tracking to its Quest 2 headset, allowing users to reproduce sign language far more realistically. Building on that tech, the popular VR app VRChat added finger tracking last week, a new layer of accessibility that lets users sign inside the app, as seen in videos shared online.

A user who goes by Jenny showed off the feature in a video for UploadVR.

Jenny is part of Helping Hands, a VR sign language community made up of 5,000 volunteers who teach others how to use the tech to keep in touch with fellow community members virtually.

The Quest 2’s hand tracking is a big upgrade over Valve’s Index VR headset, which previously forced users to come up with alternatives to common sign language signs. For instance, as Jenny demonstrated in the UploadVR video, users once had to rotate their entire wrists to mimic crossing two fingers.

But the Quest 2 relies on a suite of cameras rather than controller sensors to track finger movements, allowing for much more expressive, dexterous motion.

“As you can see it might look a little bit more like fluidy and flowy,” she told UploadVR. “That’s because it’s actually cameras just looking at my real hands rather than a controller guessing and filling in the blanks.”

“It’s really amazing and honestly a little strange to experience after so many years of all of the controllers’ movements being quite rigid,” she added.
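To illustrate what camera-based joint tracking makes possible, here is a minimal sketch of how per-finger joint positions might be used to recognize a crossed-fingers handshape, the kind of sign that, as Jenny demonstrated, controller-based tracking couldn’t capture. The joint names, coordinate frame, and thresholds are illustrative assumptions, not Meta’s or VRChat’s actual code.

# Hypothetical sketch: detecting a "crossed fingers" handshape from tracked
# finger joint positions. Joint names, coordinate frame, and threshold are
# illustrative assumptions, not Meta's or VRChat's actual API.
from dataclasses import dataclass
import math

@dataclass
class Joint:
    x: float  # lateral position across the palm (meters)
    y: float  # distance up along the fingers
    z: float  # distance out of the palm

def fingers_crossed(index_tip: Joint, middle_tip: Joint,
                    index_base: Joint, middle_base: Joint,
                    overlap_threshold: float = 0.005) -> bool:
    """Return True if the middle fingertip has moved across the index finger.

    Controller-based finger sensing can typically only report how curled a
    finger is; camera-based tracking reports full joint positions, so a
    lateral crossing like this becomes detectable.
    """
    # At the knuckles, the index finger sits on one side of the middle finger.
    base_order = index_base.x - middle_base.x
    # If the fingertips have swapped sides relative to the knuckles (with some
    # margin), the fingers are crossed.
    tip_order = index_tip.x - middle_tip.x
    return (math.copysign(1.0, base_order) != math.copysign(1.0, tip_order)
            and abs(tip_order) > overlap_threshold)

# Example with made-up coordinates: the middle fingertip has crossed over
# to the index finger's side of the hand.
print(fingers_crossed(
    index_tip=Joint(0.010, 0.080, 0.020),
    middle_tip=Joint(0.018, 0.085, 0.025),
    index_base=Joint(0.015, 0.000, 0.000),
    middle_base=Joint(-0.005, 0.000, 0.000),
))  # True

The same idea of comparing joint positions generalizes to other handshapes, though real sign recognition would require far more than a single geometric test.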

While it’s a big upgrade, the technology still has plenty of limitations. For instance, users can’t touch their virtual hands to their bodies, which many words in American Sign Language require.

But it’s a step forward, nonetheless, demonstrating that there are plenty of other great use cases for virtual reality — outside of attending dull boardroom meetings while embodying a cartoonish and legless avatar.

“I think one of the most important advantages of VRChat in general for anyone, but especially for deaf and hard of hearing individuals, is just connection with anyone around the world,” Jenny said in the video. “And that’s especially important for deaf and hard hearing people, because they are a lot more likely to experience isolation, which is something.”

