VR powered by our minds? ‘10 years from now, it will seem obvious’

[Despite arguments about feasibility and the too-often unmentioned ethical issues it raises, the prospect of thought-based interactions with and in virtual environments is provocative. This story is from The Guardian, where it features another image and related links. –Matthew]

EEG-based interactions in VR (woman wearing headset)

[Image: Brainwave-reading technology could eliminate the need for a controller when using a virtual reality headset. Photograph: Andre Kosters/EPA.]

Virtual reality powered by … our minds? ‘10 years from now, it will seem obvious’

EyeMynd wants to create a system in which thoughts, not handsets, will control our actions in virtual reality worlds

Dan Raile in San Francisco
Last modified on Monday 14 November 2016

Dan Cook first began exploring the commercial potential of brainwave technology 20 years ago, working with a government agency interested in developing better lie detectors and with pharmaceutical companies that wanted to understand the neurological impact of their drugs.

Back in 1993, virtual reality was a hobby for eccentric geeks and, thanks to the dystopian futurism of The Lawnmower Man, a Hollywood novelty. But Cook had a vision to develop animated human avatars powered directly by signals from the human brain, and left his postgraduate work in cognitive neuroscience at UC San Diego for the commercial frontier.

As virtual reality edges towards the mainstream, Cook believes now is the time for brainwave-reading technology to prove itself. His company EyeMynd wants to create an operating system that will allow brainwaves to be translated into actions in virtual reality worlds. No handset, no controller – just thoughts, directing users through virtual worlds.

“Ten years from now, this will seem obvious,” says Cook, who speaks with both deep sincerity and gleeful enthusiasm about the power of his invention. “Computers are becoming fast enough that we can detect and interpret all the signals of the brain in real time … We know how to pick up on all the signals the brain sends to the body, all the information of your senses, total cognitive and emotional tracking.”

The EyeMynd team, which has offices in Salt Lake City and San Francisco, includes Cook’s brother Nate (a postgraduate in spiritual psychology) and David Traub (the VR consultant for The Lawnmower Man) as executive producer. In spring 2017 it plans to launch a headpiece that uses 16 electroencephalography, or EEG, sensors to monitor brainwaves. The Developer Brainwave VR headset will be compatible with HTC Vive, one of the major VR headsets, and work with EyeMynd’s Brainwave OS operating system to translate the headset’s EEG readings into computer commands.

The first version of the headset will be launched for developers, giving them a toolkit to design applications using its brain-reading capabilities. Cook would not say how much the device would cost or give details about launch date or availability, but claimed it would be the most comfortable device of its kind on the market.

The headset also ships with a simple game called “Smile with Lucy”, which acts as a personalized brain-calibration tutorial. The calibration process used to take an hour, Cook says, but should soon take three minutes. In the game, the player mimics the facial expressions of an avatar while EyeMynd’s software monitors the unique brainwave patterns of the player – the minute “pattern recognition” signals generated by our brains when we see, feel, touch or move something.
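For readers curious what this kind of per-user calibration typically involves, the sketch below is a hypothetical illustration only; EyeMynd has not published its method. It follows the general approach used in much consumer EEG work: short windows of multi-channel EEG are reduced to band-power features, and a simple classifier is fitted to map those features to the expression the player was mimicking. The sampling rate, channel count and frequency bands shown are assumptions.

    # Hypothetical per-user EEG calibration sketch (not EyeMynd's actual software).
    # Band-power features from 16-channel EEG windows mapped to expression labels.
    import numpy as np
    from scipy.signal import welch
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    FS = 256          # assumed sampling rate in Hz
    BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

    def band_power_features(window):
        """window: (16 channels, n_samples) EEG segment -> flat feature vector."""
        freqs, psd = welch(window, fs=FS, nperseg=FS)   # per-channel power spectrum
        feats = [psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=1)
                 for lo, hi in BANDS.values()]          # mean power per channel, per band
        return np.concatenate(feats)

    def calibrate(windows, labels):
        """Fit a per-user classifier from labelled calibration windows."""
        X = np.array([band_power_features(w) for w in windows])
        return LinearDiscriminantAnalysis().fit(X, labels)

    # Once calibrated, fresh EEG windows can be decoded continuously, e.g.:
    #   command = clf.predict([band_power_features(latest_window)])[0]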

EyeMynd hopes to take advantage of the growing consumer interest in VR accessories. But a decade from now, Cook thinks that it will be standard to use computer interfaces that require nothing more than brain sensors, eliminating the kinds of VR accessories used now such as motion sensors, hand-controllers, head-mounted accelerometers and camera rigs.

“The key to understanding our brainwave operating system is to think about dreaming,” he said. “In a dream you can run around without moving your physical legs. That dreaming and imagining creates brain signals that we can read. With what we want to do, you won’t need eyeballs to see, or ears to hear, or hands and feet. We can bypass all of that.”

EyeMynd is not alone in the quest to bring brainwave-based technology to market. Companies such as Emotiv and NeuroSky have released EEG headsets, intended for scientific use. Others are pursuing marketing and advertising applications, and some analytics companies already offer analysis of data from VR sensors as a way of tracking the response to ads. Directly tracking a user’s subconscious physical and emotional responses to campaigns could be an advertising holy grail, if indeed it can be achieved.

“The VR content providers and people in this new space have … no reliable way to tell if they are connecting with users,” said Charles Miller of yotta.io, a New Orleans-based company that has also developed brainwave sensors. “We’re focusing a lot on the marketing space – giving true hard numbers and quantifying metrics.”

Cook and his team, however, hope to apply their brainwave technology to healthcare and education. But there is plenty of room for skepticism about the immediate future of commercial brainwave technology, whatever the sector. Most neuroscientists say that the brain’s electrical signalling can be “read” with some accuracy in the lab, but only with some level of invasive surgery.

“It’s conceptually trivial but just about impossible to do,” said Jack Gallant, head of UC Berkeley’s Gallant Neuroscience Lab, who says the process also involves vast computing power and is prohibitively expensive in both time and money. “The problem with decoding EEG signals from outside the brain is that the skull is a horrible filter. It’s a bridge too far in my experience.”

Dan Cook, however, is not going to let skepticism stand in the way of his 20-year quest. He spent most of 2016 working on funding and manufacturing partnerships, and plans to open an office in China in 2017, shipping the first version of the company's headset to developers in the spring.

Cook added that Facebook CEO Mark Zuckerberg has already publicly announced his vision of brain-to-brain communication. “Zuckerberg understands that this is the future, but I don’t think he understands how close it is. He’s talking in years but we are talking months.”

Virtual reality simulations, Cook says, are just the start of a journey in which human beings discover that our whole experience is a simulation. What might seem like a computer scientist's fantasy is fast becoming a fashionable theory among influential figures in the tech industry – including Tesla CEO Elon Musk.

“We are two levels down in a simulation within a simulation that is hosted in our brains,” says Cook. “This will become obvious in the future. There are no creators higher than us, but we are two levels down pretending to be these clueless mortals who get sick and suffer.

“Virtual reality allows us to masterfully create illusions for the purpose of pursuing and enjoying utterly unique human experiences,” he said. “This offers humanity a chance to truly understand itself.”
