Tomorrow’s gadgets will have emotional intelligence

[From Computerworld]

Elgan: When the iPhone feels your pain

Smartphones are smart, but tomorrow’s gadgets will have emotional intelligence

By Mike Elgan
January 17, 2011

Computerworld – We love our gadgets. But they treat us with an indifference that sometimes feels like contempt. They’re like cats.

But soon they’ll act more like dogs: perceiving how we feel and responding to our moods, sharing in our elation or treading lightly when we’re angry.

Such capabilities are all but inevitable, sooner or later, because the trajectory of interface design is always toward making machines increasingly “human-compatible,” meaning they interact with us the way another human being would. And that requires some level of empathy.

In a CES 2011 keynote address last week, Samsung President B.K. Yoon said, “Digital humanism will characterize the new decade that has begun.” He said that “adding emotional value to digital technology” is central to Samsung’s mission. And they’re not the only ones.

MIT’s Media Lab has a research group called the Affective Computing group that “aims to bridge the gap between computational systems and human emotions.”

One of the group’s more interesting projects resulted in a gadget called the Emotional Social Intelligence Prosthetic, or ESP — there’s no “I” in social intelligence, apparently. The purpose of the ESP is to inform the user about the emotional state of the person he or she is talking to.

The device is a tiny, handheld computer with special sensors and a camera. It looks for signs of boredom and other emotions, and informs the user so he can change the subject if he’s getting on the listener’s nerves.

Of course, the ESP project will never result in a product. But it’s an example of research devoted to the engineering problem of detecting human emotions with a handheld device.

Cambridge University researchers have developed a technology they call EmotionSense that uses both speech-recognition software and special sensors in the phone to figure out how the user is feeling.

Their goal is to develop a nonintrusive way to accurately gauge the emotional state of a person holding a smartphone.

Their initial aim is to use the technology for social science research. The idea is to find correlations between how a user feels and various locations, people or other factors that are linked to a particular state of mind. Not surprisingly, people tended to be “happy” at home and “sad” at work.

In an initial trial, the researchers found that the system was roughly 70% accurate in perceiving the emotional states reported by test subjects in a follow-up survey.
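
To make that figure concrete, here is a toy calculation in Python (with invented labels; it is not EmotionSense code) showing how such an accuracy number falls out of comparing the system’s guesses with what subjects later reported:

    # A toy illustration: compare the system's predicted emotion labels
    # with the emotions subjects reported in a follow-up survey.
    predicted = ["happy", "sad", "happy", "neutral", "angry",
                 "happy", "sad", "happy", "neutral", "happy"]
    reported  = ["happy", "sad", "neutral", "neutral", "angry",
                 "happy", "happy", "happy", "sad", "happy"]

    matches = sum(p == r for p, r in zip(predicted, reported))
    accuracy = matches / len(reported)
    print(f"Agreement with self-reports: {accuracy:.0%}")  # 70% on this made-up sample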

Although the initial aims are scientific, it’s clear that the first step in the development of an emotionally perceptive smartphone is accurate, nonintrusive sensors, which the Cambridge researchers are demonstrating.

Another, related project at Cambridge is looking at building emotion detection into GPS car navigation devices. The researchers envision a dashboard GPS device that uses special sensors to monitor facial expressions, voice intonation and hand movements to gauge the driver’s emotions. For example, if the driver were stressed out, it could hold incoming calls, delay giving additional instructions or turn off the sound system. (Personally, I think a dashboard that did all this would actually make me angry.)
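
For illustration only, here is a minimal sketch of the kind of rule such a device might apply; the stress score, threshold and actions are invented, not anything the Cambridge team has published:

    # A hypothetical rule set; the stress score, threshold and actions are made up.
    def adapt_to_driver(stress_level, threshold=0.7):
        """Pick actions for a dashboard device given an estimated stress level (0.0 to 1.0)."""
        if stress_level >= threshold:
            return ["hold incoming calls",
                    "delay non-urgent turn instructions",
                    "mute the sound system"]
        return ["behave normally"]

    print(adapt_to_driver(0.85))  # a stressed driver gets a quieter car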

Research In Motion unveiled a concept design for a phone called the BlackBerry Empathy, which is based on a wireless “mood ring.” (No, I’m not making this up.)

The ring somehow gathers your biometric data and sends it to the phone. The phone would also detect the emotions of people near you and let you know if they’re angry, sad or bored; it would do the same with your social networking contacts. The idea is that everyone would know via Facebook how everyone else was feeling, based on biometric data.
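
As a rough sketch of the general idea (nothing RIM has actually described), a phone might turn the ring’s readings into a shareable mood label along these lines; the thresholds and labels are pure invention:

    # Made-up thresholds mapping biometric readings from the ring to a mood label.
    def mood_from_biometrics(heart_rate_bpm, skin_conductance_uS):
        if heart_rate_bpm > 100 and skin_conductance_uS > 8.0:
            return "angry"
        if heart_rate_bpm < 60 and skin_conductance_uS < 2.0:
            return "bored"
        if skin_conductance_uS > 6.0:
            return "excited"
        return "calm"

    # The sort of status update the concept imagines sharing with your contacts.
    print(f"Feeling {mood_from_biometrics(110, 9.5)} right now.")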

In September, the German giant Siemens announced a phone called the CX70 EMOTY, which, according to the company, can communicate emotions via MMS messages.

The phone has sensors on the side of the handset that interpret stroking, shaking and pressing to control the emotional expression of an animated character on the screen; that expression can then be sent to someone else.
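
A back-of-the-envelope sketch of that gesture-to-expression mapping might look like the following; the gesture names and expression labels are invented for illustration, not taken from Siemens:

    # Invented mapping from handset gestures to the on-screen character's expression.
    GESTURE_TO_EXPRESSION = {
        "stroke": "affectionate",
        "shake": "agitated",
        "press": "emphatic",
    }

    def expression_for(gesture):
        return GESTURE_TO_EXPRESSION.get(gesture, "neutral")

    # The resulting expression is what would be attached to an outgoing MMS.
    print(f"The character looks {expression_for('stroke')}.")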

The applications that researchers, designers and technology dreamers envision for emotion detection technology, whether it’s friendlier GPS or automatic status updates, are largely irrelevant. What matters is the technology to detect emotion.

Once somebody builds a commercial smartphone that can figure out how you feel, any number of apps can be built to use that data.

Applications could include advice and suggestions from your phone to help you deal with your emotional state. For example, if your phone detected boredom, it could interrupt you with the latest Internet meme video. If you were feeling angry, worried or sad, it could display inspirational quotes or offer a remedy, such as driving more carefully, going to a movie, or noting that “it might be better to not send that e-mail just now.”
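
Conceptually, an app like that is little more than a lookup from a detected emotion to a response. A minimal, made-up sketch, with labels and messages that are illustrative rather than from any shipping product:

    # An invented lookup from a detected emotion to the kinds of responses imagined above.
    SUGGESTIONS = {
        "bored": "Here's the latest meme video.",
        "angry": "It might be better to not send that e-mail just now.",
        "worried": "Maybe drive a little more carefully today.",
        "sad": "How about a movie tonight?",
    }

    def respond_to(emotion):
        return SUGGESTIONS.get(emotion, "Carry on.")

    print(respond_to("angry"))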

Most likely, however, your iPhone of the future will use your emotional state to subtly change how it interacts with you, making it more intuitive and appealing to use.

Emotional smartphones and other gadgets are going to change how we interact with machines. They’ll know when you’re happy and when you’re sad, and will respond accordingly. How do you feel about that?

Mike Elgan writes about technology and tech culture. Contact and learn more about Mike at Elgan.com, or subscribe to his free e-mail newsletter, Mike’s List.
