Sensors that understand facial expressions, ‘virtual reality’ screens: Google’s Android boss reveals the phone of the future

[From The Daily Mail; see the video interview at CNN International; a related interview excerpt from 2012 follows]


Sensors that can understand your facial expressions and ‘virtual reality’ screens: Google’s Android boss reveals the phone of the future

  • Matias Duarte said making world more ‘tangible’ is next step
  • Team also working on Google’s Glass wearable computer
  • Computers must work ‘as people expect, not the other way round’

By Steve Robson
Published: 25 February 2013 | Updated: 26 February 2013

The mobile phones of the future will recognise our faces and hands and create a world that is like virtual reality.

These were the claims of Matias Duarte, Google’s Director of Android User Experience, speaking at the Mobile World Congress in Barcelona.

He said the ability to make objects more ‘tangible’ will be the next major development in mobile technology.

‘Computers have to work the way people expect and not the other way round,’ he told CNN.

‘I want everything you can touch on the screen to operate like objects in the real world do.

‘That doesn’t mean they have to look like copies of objects in the real world, but they have to be tangible and physical and delightful.’

Duarte said users’ experience of mobile phones has been transformed by being able to ‘stroke’ screens.

‘In the old days we used to be poking at phones,’ he said. ‘If I were to start poking you, you wouldn’t like it, but when you start stroking, it’s a totally different message.

‘Right now we only recognize a couple of fingers, and on screens that are small and always in the palm of your hand.

‘In the future, we will look at the gestures of your entire body, facial expressions, arms, all of the fingers that you have, and you’re going to have screens not just in the palm of your hand, but all around you.’

Duarte’s team is also believed to be involved in the development of Google’s Glass wearable computer.

It will go on sale later this year for under £100, and will give consumers a taste of the ‘virtual reality’ experience Duarte referred to.

Google’s Android software was used on almost 70 per cent of mobile phones and tablets sold in 2012.

But Mozilla, maker of the popular Firefox internet software, is preparing to take on both Google’s Android and Apple’s iOS this summer.

A new Firefox operating system for mobile devices has been backed by 13 wireless service providers around the world, including Spain’s Telefonica, China Unicom and America Movil.

—–

[From Gizmodo]

Matias Duarte Q&A: Jelly Bean, the Nexus 7, and the Wild, Weird World of Android

Brent Rose
Jun 29, 2012

Matias Duarte is the Director of Android User Experience at Google, which means he’s the artist who pretties up the green robot’s gears. We got some one-on-one time with him at Google I/O, and he opened up about the Nexus 7, Jelly Bean, and why we shouldn’t be so huffy about Android fragmentation.

Giz: What is your overarching vision for where Android is heading in terms of UI and design?

MD: Well, I have a particular style in the way that I design things. Fundamentally, I believe that people want to touch and interact with things like they do with things in the real world, and that every type of interaction experience can be a delightful experience as well as an easy experience. A lot of people see those two things as being at odds, but I really like to try to find design solutions that blend the two.

I like to say that things are more fun than buttons. And you can solve most design problems by turning them into tangible, physical objects, without necessarily making them skeuomorphic copies of real world objects, but still tapping into those parts of the human brain and the human heart that are like, “I’ve got hands. I was built for picking up and moving things and swiping things around.” So I want to transform the types of interactions we have with computers that are today really all about hunting and pecking and picking and menus, into an experience that is a much more gestural, physical, emotional experience. And transform the entire OS like that.

We’ve made huge strides. We started in Ice Cream Sandwich [ed. Android 4.0], I think, to kind of set that vision to make an operating system that was really beautiful, and easy to use, and powerful. In Jelly Bean [ed. Android 4.1] we’ve taken one tiny next step, but I’m very happy with the improvements we’ve made in performance because that helps create that entire illusion.

Giz: Ice Cream Sandwich was very well received; what were the first changes you wanted to make in the update to Jelly Bean?

MD: Well, one of the biggest things was Project Butter, where we declared a war on jankiness and crustiness and graininess, because that’s so important. If you really want to create something that’s beautiful and emotionally engaging and feels like a real thing, you need to be fast enough. Right? You just need to get to the point where you’re fooling the eye and fooling the finger. So that was one of the top priorities right from the very beginning.

And then of course we wanted to continue to improve all of the other aspects of the OS. Things that seem small, like the expanded notifications, have a big impact because now all of a sudden every application can reach out and talk to you in a way that is less intrusive. When I see I have mail now, I know who’s actually mailing me. Can I deal with it later, or do I actually have to switch apps? There’s a big cost to switching apps. We make multitasking really easy on Android, but still it’s much easier to just peek open that shade, be like, “Nah, I don’t need to deal with that,” and swipe it closed again.
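
[A minimal sketch of the ‘expanded notifications’ Duarte mentions above, using the Notification.BigTextStyle API that Android 4.1 introduced for exactly this peek-in-the-shade behaviour. The class, icon and message strings below are illustrative placeholders, not Google’s own code.]

```java
import android.app.Notification;
import android.app.NotificationManager;
import android.content.Context;

public class MailNotifier {

    // Post a Jelly Bean-style expandable notification: the collapsed row shows
    // who is mailing you, and dragging the shade open reveals a message preview
    // so you can decide whether switching apps is worth it.
    public static void notifyNewMail(Context context, String sender, String preview) {
        Notification notification = new Notification.Builder(context)
                .setSmallIcon(android.R.drawable.stat_notify_chat) // placeholder icon
                .setContentTitle("New mail from " + sender)        // visible when collapsed
                .setContentText(preview)                           // one-line summary
                .setStyle(new Notification.BigTextStyle()
                        .bigText(preview))                         // full text when expanded
                .build();

        NotificationManager manager =
                (NotificationManager) context.getSystemService(Context.NOTIFICATION_SERVICE);
        manager.notify(1, notification); // hypothetical notification id
    }
}
```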

Giz: The version of Jelly Bean that we have right now is a developer preview, right?

MD: Yes, it’s a developer preview so it still has a few bugs that we know about, and obviously a few that we don’t know about, but when the devices actually begin shipping in a few weeks you’ll see a lot of additional polish go in.

Giz: Is there anything that wasn’t announced that’s going to be going in when it launches?

MD: Surprise features?

Giz: Yeah.

MD: Well, if we tell you now then it wouldn’t be a surprise.

[snip to end]


Comments

One response to “Sensors that understand facial expressions, ‘virtual reality’ screens: Google’s Android boss reveals the phone of the future”


  1. Adrian P

    I think that this is a great idea to immerse the user even more deeply in the virtual world of their phone. The CNN interview I watched with Matias Duarte was very eye-opening. I also believe that people want to touch and interact with things like they do with things in the real world, as it can make finding things on your phone a lot simpler. This is a very ambitious move for the company, as Google’s Android software was used on almost 70 per cent of mobile phones and tablets sold in 2012. They might be doing this as a response to the new Firefox operating system that is going to be released for mobile devices. They might also use this new business strategy to boost declining Android phone sales as most people are buying iPhones. Either way, it will be interesting to see the final version when it is released.
