Emotion tech developer uses EEG to control computers

[From The Epoch Times]

[Image: Psychic Toys: Software developer Robert Oschler looks at his WowWee Rovio robot while wearing an Emotiv EPOC headset. Using the headset, he programmed the robot to be controlled by his thoughts. (Robert Oschler)]

Emotion Tech Developer Shares Dreams of Electric Sheep

On the tech frontier, boundaries between man and machine fade

By Joshua Philipp
Epoch Times Staff
Created: Dec 6, 2010 Last Updated: Dec 7, 2010

Few may remember the 1982 film “Firefox,” about a Soviet fighter jet controlled by its pilot’s mind. The film, starring Clint Eastwood, was built on the premise that technology commanded by thought would be a weapon so dangerous that “it would change the structure of our world.”

Or, as software developer Robert Oschler has discovered, you can use it to watch YouTube.

Oschler is a pioneer on an emerging frontier, a place where the boundaries between man and machine blur and technological limits fade. Using what is known as an electroencephalogram (EEG), he has replaced the keyboard and mouse with his own thoughts and emotions.

His tool of choice is a consumer EEG headset, the Emotiv EPOC. The device sports 14 electrodes that wrap around the user’s head, a built-in gyroscope, and the ability to detect emotions.

His interest in the technology isn’t driven by monetary gain. “It’s fascinating. That’s what it’s really about,” Oschler said, adding that the work “can’t even be compared” to writing a Web app, both in terms of capabilities and in unrefined geek-style coolness.

“It opens up a whole new range of software that has never been possible, and a whole new way of working with computers that has never been possible before,” Oschler said.

Like in the book “Do Androids Dream of Electric Sheep?” that was re-imagined in the film “Blade Runner,” Oschler’s work has merged emotion into the computer system.

Oschler’s first EEG project was a program that let him steer a WowWee Rovio robot with the emotion of sadness. When it jolted forward, he was hooked. “It was really surreal,” he said.
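The article doesn’t show Oschler’s code, but the mechanism it implies, mapping a single emotion reading to a movement command, can be sketched roughly as follows. The sadness scores, threshold, and command names are all hypothetical stand-ins; the real program used the Emotiv SDK and WowWee’s Rovio interface, neither of which appears here.

```python
# Hypothetical sketch: drive a robot from an emotion intensity.
# The sadness score (0.0-1.0), the threshold, and the command
# strings are invented for illustration only.

SADNESS_THRESHOLD = 0.6  # assumed trigger level

def drive_command(sadness_level: float) -> str:
    """Translate a sadness reading into a robot command."""
    return "FORWARD" if sadness_level >= SADNESS_THRESHOLD else "STOP"

# Simulated stream of EEG-derived sadness readings.
readings = [0.1, 0.3, 0.8, 0.2]
commands = [drive_command(r) for r in readings]
print(commands)  # ['STOP', 'STOP', 'FORWARD', 'STOP']
```

In this sketch, only a sufficiently strong reading moves the robot, which would produce the sudden “jolt forward” Oschler describes.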

He soon launched into his second project, EmoRate. The program syncs with YouTube’s answer to home television, Leanback. While watching videos, EmoRate will bookmark parts of the video based on the user’s emotional response.

The user can pull up a menu and see where he became the most happy, sad, angry, or frightened in the video, “and simply by smiling or thinking of something that makes me mad, it takes me right back to that part of the video and starts to play back right at that point in the video,” he said.
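The bookmarking behavior described above can be sketched as follows. The emotion names, intensity scores, and timestamps are simulated; a real implementation would read them from an EEG SDK such as Emotiv’s and seek the YouTube player, neither of which is shown.

```python
# Hypothetical sketch of EmoRate-style emotional bookmarking.
# Samples are (timestamp_seconds, {emotion: intensity}) pairs;
# all values here are simulated stand-ins for EEG readings.

def bookmark_peaks(samples):
    """Return the timestamp of the strongest reading per emotion."""
    peaks = {}  # emotion -> (intensity, timestamp)
    for t, scores in samples:
        for emotion, intensity in scores.items():
            if emotion not in peaks or intensity > peaks[emotion][0]:
                peaks[emotion] = (intensity, t)
    return {emotion: t for emotion, (_, t) in peaks.items()}

# Simulated viewing session.
session = [
    (10, {"happy": 0.2, "sad": 0.1}),
    (45, {"happy": 0.9, "sad": 0.1}),  # peak happiness
    (80, {"happy": 0.3, "sad": 0.7}),  # peak sadness
]
bookmarks = bookmark_peaks(session)
print(bookmarks["happy"])  # 45
```

Re-evoking an emotion later would then look up its bookmarked timestamp and seek the video back to that moment.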

Android Future

The U.S. military has already begun looking into EEG technology. DARPA, sometimes referred to as the “mad scientist” branch of the Department of Defense, is working on everything from bomb-sniffing remote-controlled rats to a monkey capable of controlling a robotic arm using only its thoughts.

The agency is currently exploring the potential of EEG systems for troops on the battlefield. The possibilities are vast: a single thought could pull a trigger, and a moment of curiosity could launch a computer search whose results appear instantly on an eyepiece worn by the soldier.

The technology would give immediate access to information, and eliminate the mediators between man and machine—turning the computer into an extension of the individual that could be controlled in the same way as their arms and legs.

It’s the cyborg future, where a user is wired to their mechanical extensions by the power of their own thoughts.

Such technology has been the premise of countless sci-fi films, from “Avatar” and “Inception” to “Tron.” Oschler notes that in the extended edition of “The Matrix,” an artificial intelligence (AI) researcher explains that once a computer system overcomes the senses and responds to the user’s body movements, a kind of remote sensing sets in, and the virtual reality may begin to seem more real than the physical one.

For Oschler, however, the dystopian future isn’t much of a concern. For him, the technology is more human—closer to normal interaction than what’s currently being used. “Our minds are forced to work in a very alien manner with computers right now,” Oschler observed.

“The reality is we hate the data processing—despite the fact that it’s our strongest talent—because look at how eager we are to turn it over to computers,” Oschler said. “Now here comes the ability for us to turn all of that around.”

An example of this in action is a piece of software called “EmoLens” that Oschler built. It lets users search through images on Flickr using their emotions, rather than word searches. It’s designed to solve the “tip of the tongue” syndrome by filtering through images based on the user’s emotional reaction to them.
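The emotional search described above can be sketched roughly as follows. The image names and reaction scores are simulated, and no Flickr call is made; how EmoLens actually records and queries reactions is not described in the article.

```python
# Hypothetical sketch of EmoLens-style image search by emotion.
# Viewings are (image_id, {emotion: intensity}) pairs; all values
# here are invented stand-ins for EEG-derived reactions.

def index_reactions(viewings):
    """Map each emotion to the images that evoked it most strongly."""
    index = {}
    for image_id, scores in viewings:
        dominant = max(scores, key=scores.get)
        index.setdefault(dominant, []).append(image_id)
    return index

def search_by_emotion(index, emotion):
    """Return images whose dominant recorded reaction was `emotion`."""
    return index.get(emotion, [])

viewings = [
    ("beach.jpg", {"happy": 0.8, "sad": 0.1}),
    ("funeral.jpg", {"happy": 0.0, "sad": 0.9}),
    ("puppy.jpg", {"happy": 0.7, "sad": 0.0}),
]
idx = index_reactions(viewings)
print(search_by_emotion(idx, "happy"))  # ['beach.jpg', 'puppy.jpg']
```

The point of the design is that the query itself is a feeling rather than a keyword, matching Oschler’s remark that the software “starts with a feeling.”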

“Our minds start with feelings and then we get the words, because obviously as a baby you didn’t know words. Words came later,” he said. “So with our minds, we start with feelings, and then get the words. But with computers, you have to start with words.”

“What I’m doing with this software is I’m starting with a feeling,” he said.

