Intel moves to make 3D gesture control as pervasive as the mouse

[From ExtremeTech, where the story includes additional images. If nothing else, be sure to watch the frog video]

[Image: perceptual computing graphic]


By David Cardinal on September 11, 2013

Gesture-control peripheral makers Leap Motion and Haptix have stolen the spotlight from traditional input devices over the past year with promising new devices and OEM design wins. A bit lost in the mix has been tech giant Intel’s sizable investment in what it calls perceptual computing. At IDF 2013 this week, Intel unveiled its plans to make 3D input as common as the mouse and touch have become. Front and center was Creative’s new $200 Senz3D, which supports applications developed using Intel’s free perceptual computing SDK. In addition to demos of the Senz3D, Intel announced that its upcoming 14nm Broadwell chip, available for ultrabooks in late 2014, will support integrating a 3D camera directly into the bezel of laptops.

Perceptual computing as the key to pervasive computing

Intel first unveiled perceptual computing, along with its perceptual SDK, at IDF 2012, and the company believes it will prove as important and transformative as the mouse was 50 years ago. It contends that as computing becomes as pervasive as the power grid is today, “perceptual” interfaces will be required, and it defines those interfaces as immersive, intuitive and natural. During a technical breakout session, Intel’s Director of Perceptual Computing, Dr. Achintya Bhowmik, showed off this effective and amusing 27-second video of a frog playing a video game to illustrate the point.

For creating perceptual interfaces, Intel sees the need for both true 3D video and stereo audio input. Instead of the simple dual-camera-plus-infrared-LED setups found in inexpensive competing devices like the Leap Motion or Haptix, Intel’s design uses a 720p RGB camera from SoftKinetic that also captures time-of-flight-based depth information at the lower resolution of 320×240 pixels. The combination provides Kinect-like motion-and-gesture tracking using software also from SoftKinetic, although at the shorter, complementary distance range of six inches to three feet (15–90 cm). Measuring depth by time of flight makes the Senz3D much less sensitive to ambient light conditions than devices that track reflected infrared LED light.
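The time-of-flight principle itself is simple to state: the camera emits light and times its round trip to the subject and back, and depth follows directly from the speed of light. The sketch below illustrates the basic arithmetic with made-up numbers; it is not SoftKinetic's actual signal-processing pipeline, which recovers these timings indirectly from modulated light.

```python
# Time-of-flight depth: light travels to the target and back,
# so distance = (speed of light * round-trip time) / 2.
C = 299_792_458.0  # speed of light in m/s

def tof_depth_m(round_trip_s: float) -> float:
    """Depth in meters from a measured round-trip time in seconds."""
    return C * round_trip_s / 2.0

# Over the Senz3D's stated working range (0.15-0.90 m), the round-trip
# times being resolved are on the order of a few nanoseconds:
for depth in (0.15, 0.90):
    t = 2.0 * depth / C
    print(f"{depth:.2f} m -> round trip {t * 1e9:.2f} ns")
```

The nanosecond-scale timings are why practical time-of-flight cameras measure phase shifts of modulated light rather than timing individual pulses directly.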

Also, unlike most other gesture-based devices, Intel’s design includes stereo microphones to allow for improved speech recognition and background-noise rejection. Bhowmik showed some examples of how the combination of video and audio input can improve speech recognition, which certainly convinced most of us in the audience. For now, all this added functionality comes at a cost: the Creative Senz3D is priced at more than twice what the Leap is. Intel expects prices to drop dramatically when the technology is designed into laptops beginning in late 2014. Clearly that’s when Intel’s bet in this area is supposed to start paying off for the company. When I asked Intel’s perceptual computing marketing manager “what’s in it for Intel?” he minced no words in explaining that creating applications that demand processor cycles is in the company’s interest.
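One classical way a stereo microphone pair can reject background noise is delay-and-sum beamforming: delay one channel so the target talker's signal lines up in both, then average, which reinforces the speech while partially canceling uncorrelated noise. The sketch below is a generic illustration of that idea, not Intel's or SoftKinetic's actual algorithm; the signal, noise levels, and 3-sample lag are all invented for the example.

```python
import numpy as np

def delay_and_sum(left: np.ndarray, right: np.ndarray, right_lag: int) -> np.ndarray:
    """Align the right channel by its lag (in samples), then average the pair.

    Crude integer alignment via np.roll; real systems estimate and apply
    fractional delays.
    """
    aligned = np.roll(right, -right_lag)
    return 0.5 * (left + aligned)

# Toy example: the same "speech" arrives at the right mic 3 samples late,
# with independent noise added on each channel.
rng = np.random.default_rng(0)
speech = np.sin(2 * np.pi * np.arange(400) / 40.0)  # 10 full periods
left = speech + 0.5 * rng.standard_normal(400)
right = np.roll(speech, 3) + 0.5 * rng.standard_normal(400)

out = delay_and_sum(left, right, right_lag=3)
# Averaging two aligned copies keeps the speech at full level but halves
# the noise power, so `out` is closer to the clean signal than either mic.
```

The same geometry cuts both ways: the inter-microphone delay that lets the pair favor one direction is also a cue for estimating where a talker is, which is part of why pairing audio with the camera's 3D view is attractive.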

Hands-on shows that Intel’s design has paid off

Unlike my initial underwhelming experience using a Leap Motion, I found using a Senz3D to be quite straightforward. I was deflecting lasers in the sample shoot-em-up within seconds, and more or less able to keep a fighter plane from crashing in the combat flight simulator with no training. As with any “hand waving” interface, arm fatigue is a real issue. I think developers are going to need to find a way to mix keyboard, mouse, touch and gesture input in a fashion that combines the strengths of each, as relying solely on gestures just doesn’t seem practical or desirable in most cases.

This entry was posted in Presence in the News.
