Emteq uses biometrics to move VR to its fourth generation

[This story from Digital Trends suggests an evolution of VR toward deeper social presence experiences based on biometrics. See also the recent ISPR Presence News post about the Veeso headset, which uses cameras to measure facial expressions. –Matthew]

[Image: FaceTeq headset. Source: Road to VR]

Emteq wants to track your facial and eye movements for emotional interaction in VR

By Kevin Parrish — September 9, 2016

In June, a company called Emteq came out of stealth mode and revealed that it's working on a solution that will enable emotional interaction within virtual reality. Right now we can interact with virtual objects, NPCs, and the surrounding artificial environment using our head movements and hands. That's what the company considers first- and second-generation VR, with the third generation consisting of eye-tracking technology. After that, the company hopes interaction through facial expressions will move the VR industry into its fourth generation.

Emteq’s new system is called FaceTeq, a facial sensing platform that tracks facial gestures and biometric responses. The platform detects the user’s electrical muscle activity, eye movement, heart rate and heart rate variability, head position, and stress response. The AI-powered FaceTeq engine samples this data 1,000 times per second and interprets it in real time, determining the user’s emotional state and physical expression.
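Emteq has not published its sensor API or classification method, so the pipeline above can only be sketched in broad strokes. The toy example below is purely illustrative: it simulates one second of facial EMG data at the 1,000 Hz rate the article cites, summarizes each channel with a root-mean-square amplitude, and applies an invented threshold rule (the muscle names, thresholds, and labels are all assumptions, not FaceTeq's actual logic).

```python
import math

# Illustrative only: simulates the kind of windowed EMG processing the
# article describes, not Emteq's actual implementation.
SAMPLE_RATE_HZ = 1000  # the sampling rate cited in the article

def emg_rms(samples):
    """Root-mean-square amplitude, a common summary of EMG muscle activity."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def classify_window(emg_brow, emg_cheek):
    """Toy rule: stronger cheek (smiling-muscle) activity suggests a smile,
    stronger brow (frowning-muscle) activity suggests a frown."""
    brow, cheek = emg_rms(emg_brow), emg_rms(emg_cheek)
    if cheek > brow * 1.5:
        return "smile"
    if brow > cheek * 1.5:
        return "frown"
    return "neutral"

# Simulated one-second window: a strong cheek signal, a weak brow signal.
brow = [0.1 * math.sin(2 * math.pi * 50 * t / SAMPLE_RATE_HZ)
        for t in range(SAMPLE_RATE_HZ)]
cheek = [0.8 * math.sin(2 * math.pi * 50 * t / SAMPLE_RATE_HZ)
         for t in range(SAMPLE_RATE_HZ)]
print(classify_window(brow, cheek))  # smile
```

A real system would fuse many more channels (heart rate, head position, eye movement) and use a trained model rather than fixed thresholds, but the window-then-summarize-then-classify shape is the standard pattern for this kind of real-time biometric processing.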

“Our facial expressions are at the core of our social interactions, enabling us to silently, and instantly communicate our feelings, desires and intentions,” the company states. “Often called an empathy machine, VR represents a new paradigm in Human Computer Interaction; a naturalistic interface using physical motion. But the empathy machine needs emotional input, and FaceTeq provides the solution.”

Emteq reportedly wants to use this technology to grow the social VR space. The biometric sensors used by the FaceTeq platform would be installed in the faceplate of a VR headset such as the Oculus Rift or HTC Vive. This would be far superior to using mere cameras, as the sensors would pick up on every frown, every facial twitch, every slight eye movement, and so on, accurately depicting the user in the virtual realm.

“Imagine playing an immersive role-playing game where you need to emotionally engage with a character to progress,” said Graeme Cox, chief executive and co-founder of Emteq. “With our technology, that is possible — it literally enables your computer to know when you are having a bad day or when you are tired.”

The company believes that facial expressions are a major component missing from VR-based interactions. And while there are head-mounted displays (HMDs) with attached depth cameras that capture the lower portion of the user’s face, this method misses the most important facial details: the eyes. Yet it’s the muscles surrounding the eyes that help visually define surprise, anger, and sadness, and that’s exactly the area currently covered up by HMDs.

Right now the company is aiming FaceTeq at developers, creative agencies, brands, market researchers, innovators, and health care professionals, meaning the technology isn’t just meant for social VR interaction. Researchers could use the technology to understand “the basis of our psychological and emotional responses to the world” whereas brands could use FaceTeq to generate an emotional connection with their audience.

All that said, FaceTeq is an open platform that’s low-cost and lightweight. The company believes the platform will “herald” the fourth generation of VR, so we’ll see what early adopters produce. Meanwhile, the company is working to partner with headset manufacturers and content creators, so stay tuned.

This entry was posted in Presence in the News.
