See-thru AR

[From Delta, the newspaper of Delft University of Technology in the Netherlands; more information is available]

October 6, 2009 | Nadine Böke

On top of the world

It’s the latest craze in smartphone applications: augmented reality. Point your camera at a street and information about restaurants and other attractions is displayed on top of the video image of that particular street. But there is a similar technique that’s arguably even better: see-through augmented reality.

Imagine finding your way by car without having to look at the screen of your TomTom; instead, you’d just put on a special pair of driving glasses and see your route displayed before you on the road. This is one example that Jurjen Caarls, a PhD student, offers as a way to use see-through augmented reality (AR), a technique that allows us to look at the real world with virtual information displayed on top of it. Caarls has been working on a PhD project concerning augmented reality at the faculty of Applied Sciences’ Quantitative Imaging Group. For his project, he developed a prototype of a working see-through AR system, consisting of a headset, a pair of semi-transparent mirrors as glasses, and a portable laptop. Caarls will receive his PhD degree on September 25.

The biggest challenge Caarls faced in developing the system did not involve the hardware, as that already existed, but rather the positioning. Caarls: “For an AR system to work properly, you need accurate information about the position of the device, or, in this case, the position of the eyes of the user. Only then is it possible to properly align the virtual world with the real world.” A whole range of sensor types can be used to determine the position of an AR device within a certain space. Simple AR applications on smartphones use a combination of GPS and magnetic sensors. “But these are not very reliable,” Caarls adds. “GPS information can be a few meters off, and the Earth’s magnetic field is easily distorted by metal objects.”

For his own AR system, Caarls used a different set of position-measuring sensors: a camera and inertial sensors. Caarls: “The problem with cameras is that while they are capable of obtaining an accurate absolute position, they are rather slow. Inertial sensors are faster, but less accurate, as they only provide accelerations and rotation rates. Ideally, then, you want to combine the two. We developed image-processing techniques and filters that, while the AR device is in use, allow the absolute position to be determined with the camera and a single small visual marker. Information from the inertial sensors is then used to interpolate between camera measurements.”
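The combination Caarls describes can be sketched in miniature. The class below is a toy one-dimensional illustration of the general idea only: fast but drifting inertial dead-reckoning, periodically pulled back toward a slow but absolute camera fix. All names, rates, and the blending scheme are invented for illustration; they are not taken from Caarls’ thesis, which uses far more sophisticated filtering.

```python
class FusionFilter1D:
    """Toy 1-D sketch of camera/inertial fusion (illustrative only)."""

    def __init__(self, blend=0.5):
        self.position = 0.0
        self.velocity = 0.0
        self.blend = blend  # weight given to a camera fix when it arrives

    def inertial_update(self, acceleration, dt):
        # Dead-reckon between camera frames by integrating acceleration.
        # Fast, but any sensor error accumulates as drift.
        self.velocity += acceleration * dt
        self.position += self.velocity * dt

    def camera_update(self, measured_position):
        # A camera fix is slow to compute but absolute: blend the drifting
        # inertial estimate toward it to cancel the accumulated error.
        self.position += self.blend * (measured_position - self.position)


# Example: integrate one inertial step, then correct with a camera fix.
f = FusionFilter1D(blend=0.5)
f.inertial_update(1.0, 0.1)   # position is now 0.01 by dead reckoning
f.camera_update(1.0)          # camera says 1.0; estimate moves to 0.505
```

In a real system the same principle runs in all six degrees of freedom, typically inside a Kalman-style filter rather than this fixed blend.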

One problem that Caarls could not solve completely, however, is the small time lag between what the user sees in the real world and the virtual projection. “It takes time to get information from the sensors, process it and generate a view of the virtual world,” Caarls explains. “Maybe just something like eighty milliseconds, but you will still notice this when using the device. So what we tried to do is not just process the data at a given time, but to extrapolate it, thereby predicting what the position of the device will be in the near future. But that’s hard to do.”

This time-lag problem wouldn’t occur if, instead of semi-transparent glasses, non-transparent glasses were used that display a video of the surroundings on the inner side. This is called video see-through AR, and it is the technique used in smartphones. But this technique has other disadvantages, Caarls says: “Although with video see-through there is time to properly align the video images of the real world with the virtual information, there will now be an unavoidable time lag between the movements people make and what they see. For some people, this causes headaches, dizziness and nausea. They basically get motion sickness.”
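The extrapolation Caarls mentions can be illustrated with the simplest possible predictor: assume the head keeps moving at its most recent velocity and render for where it will be when the frame actually reaches the eye. The function below is a hypothetical constant-velocity sketch, not the predictor from the thesis; real head motion is not constant-velocity, which is exactly why Caarls calls the problem hard.

```python
def predict_pose(t_prev, x_prev, t_curr, x_curr, latency):
    """Constant-velocity extrapolation of a 1-D pose coordinate.

    Given the last two pose samples (t_prev, x_prev) and (t_curr, x_curr),
    estimate the pose `latency` seconds after t_curr, so the overlay is
    drawn for the view the user will have once the frame is displayed.
    Illustrative sketch only.
    """
    velocity = (x_curr - x_prev) / (t_curr - t_prev)
    return x_curr + velocity * latency


# Example: head moving steadily, system latency of 80 ms.
predicted = predict_pose(0.0, 0.0, 0.1, 1.0, 0.08)  # render for x = 1.8
```

Any error in the velocity estimate is multiplied by the latency, so an 80 ms pipeline amplifies even small tracking noise into visible overlay jitter.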

Even though Caarls’ AR system prototype is now fully developed and functioning quite well, one shouldn’t expect to be able to wear such special driving glasses any time soon. “At the moment, the system is still quite expensive and large,” Caarls explains. “There needs to be commercial interest in it, so that the hardware will be further developed and the prices will drop. But the system is already being used. A group called the AR-lab of the Royal Academy of Arts in The Hague uses the techniques I developed for AR projects at museums like the Boijmans van Beuningen and the Kröller-Müller. A next step will probably be something like architects using optical see-through AR to show their designs to their clients. After that, the gaming industry might use the system. And then perhaps one day we will no longer use computer monitors, but rather simply put on our AR glasses.”

Pose Estimation for Mobile Devices and Augmented Reality, Jurjen Caarls, promotion date: September 25, 2009.

ISPR Presence News
