[This Hackster.io report describes a new system developed at Northwestern University that could substantially increase the use of full-body tracking to generate “intuitive and immersive” (presence) experiences for technology users. See the original story for three different images and a 31-second video, and find more information and a 2:34-minute video (also available on YouTube) in coverage from Northwestern. –Matthew]
[Image: Side-by-side image of a real-life scenario next to the motion-capture video generated by MobilePoser. Credit: Karan Ahuja/Northwestern University. Source: The Engineer]
Tracking Movement on the Move
MobilePoser uses AI and the sensors already present in our wearable devices to create a low-cost and portable full-body tracking system.
By Nick Bild
October 17, 2024
The next frontier in intuitive and immersive user interface design might involve the use of body tracking technologies. Using these systems, individuals can control their electronic devices with hand gestures or other natural movements. Body tracking devices can also accurately position an individual’s body within a virtual environment, greatly enhancing the illusion of reality. But while many body tracking systems have been developed to date, some of them quite good, they are still not widely used.
If this technology is so promising, then why is hardly anyone using it on a regular basis? The answer boils down to two primary factors — cost and convenience. High-end motion capture systems can easily cost upwards of $100,000, which is not exactly what most people have in their budget for a virtual reality or smart home setup. Moreover, body tracking typically requires either an array of worn sensors, which is quite cumbersome and hardly desirable in public spaces, or fixed cameras that limit the use of the system to a small area and raise some privacy-related concerns.
Whichever way you go, mobility is not really an option. Yet today, most of our digital lives are spent on the go. So to increase adoption, mobile-friendly body tracking solutions are needed. And of course, bringing costs down will also serve to get this technology into more people’s hands. An idea out of the University of Chicago and Northwestern University looks like it might check both of these boxes — and it appears to work quite well to boot. The research group came up with a way to leverage the sensors that are already present in smartphones, smart watches, and wireless earbuds to track the position of the user’s body.
Their approach, called MobilePoser, takes advantage of the fact that many people now regularly use more than one wearable device equipped with an inertial measurement unit (IMU). When worn, these sensors provide detailed information about the movements and orientation of various parts of the body. The only problem is that the IMUs found in consumer electronics are not precise enough on their own to accurately determine one’s body position.
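To make the multi-device idea concrete, here is a minimal sketch, not drawn from the MobilePoser code, of how readings from several worn IMUs might be packed into one fixed-size input for a pose model. The device slots, the 12-value per-device feature layout, and the zero-filling of absent devices are all illustrative assumptions.

```python
import numpy as np

def imu_features(accel_xyz, rotation_3x3):
    # 3-axis acceleration plus a flattened 3x3 orientation matrix: 12 values.
    return np.concatenate([np.asarray(accel_xyz, dtype=np.float32),
                           np.asarray(rotation_3x3, dtype=np.float32).ravel()])

# Fixed slots for devices a user might wear; absent devices are zero-filled
# so the downstream model always sees the same input size.
DEVICE_SLOTS = ["phone", "watch", "earbuds"]

def assemble_frame(readings):
    """readings: dict mapping device name -> (accel_xyz, rotation_3x3)."""
    frame = []
    for device in DEVICE_SLOTS:
        if device in readings:
            frame.append(imu_features(*readings[device]))
        else:
            frame.append(np.zeros(12, dtype=np.float32))  # device not worn
    return np.concatenate(frame)  # shape: (36,)

# Example: only a phone and a watch are worn in this session.
frame = assemble_frame({
    "phone": ([0.1, 0.0, 9.8], np.eye(3)),
    "watch": ([0.3, -0.2, 9.7], np.eye(3)),
})
print(frame.shape)  # (36,)
```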
To overcome this, the team developed a deep neural network that was trained to predict one’s full-body pose from the measurements captured by one or more worn IMUs. This neural network was trained on a large dataset of synthetic IMU measurements generated from high-quality motion capture data. There can still be some ambiguity in the predictions, however, so a physics-based optimization step was added to filter the algorithm’s results. This filtering ensures that predictions exhibit spatiotemporal consistency and plausible kinematics.
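The article describes two stages: a learned predictor that maps IMU readings to a full-body pose, and a physics-based optimization that keeps the result spatiotemporally consistent. The sketch below is only a rough stand-in for that idea; the network architecture, joint count, and the simple temporal-smoothness pass used here in place of the actual physics optimizer are all assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

NUM_JOINTS = 24     # assumed joint count for a SMPL-style body model
IMU_FEATURES = 36   # per-frame input, e.g. 3 devices x 12 features each

# Stage 1: a small recurrent network mapping a window of IMU features to
# per-frame joint positions. The real system trains a deep network on
# synthetic IMU data derived from motion capture; this is just a placeholder.
class PosePredictor(nn.Module):
    def __init__(self, hidden=256):
        super().__init__()
        self.rnn = nn.GRU(IMU_FEATURES, hidden, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, NUM_JOINTS * 3)

    def forward(self, imu_window):             # (batch, time, IMU_FEATURES)
        h, _ = self.rnn(imu_window)
        batch, time, _ = imu_window.shape
        return self.head(h).view(batch, time, NUM_JOINTS, 3)

# Stage 2: stand-in for the physics-based refinement -- here, just a penalty
# that keeps the refined poses close to the predictions while discouraging
# joints from jumping between consecutive frames.
def temporal_smooth(poses, weight=0.5, steps=20, lr=0.05):
    poses = poses.detach()
    refined = poses.clone().requires_grad_(True)
    opt = torch.optim.Adam([refined], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        data_term = ((refined - poses) ** 2).mean()
        smooth_term = ((refined[:, 1:] - refined[:, :-1]) ** 2).mean()
        (data_term + weight * smooth_term).backward()
        opt.step()
    return refined.detach()

model = PosePredictor()
imu_window = torch.randn(1, 60, IMU_FEATURES)   # one second of data at 60 Hz
raw_poses = model(imu_window)
poses = temporal_smooth(raw_poses)
print(poses.shape)  # torch.Size([1, 60, 24, 3])
```

In the actual system the refinement step also enforces plausible kinematics, not just frame-to-frame smoothness, which is what keeps ambiguous predictions physically believable.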
The MobilePoser processing pipeline was designed to run entirely on-device to ensure that the system is truly portable. On an Apple iPhone 15 Pro, the tracking system was reported to run at 60 frames per second.
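As a rough sanity check of what running at 60 frames per second implies, each frame's sensing, inference, and filtering must fit within about 1000 / 60 ≈ 16.7 ms. The generic timing helper below is an illustrative way to check whether some per-frame callable meets that budget; it is not part of MobilePoser.

```python
import time

TARGET_FPS = 60
FRAME_BUDGET_MS = 1000.0 / TARGET_FPS   # about 16.7 ms per frame

def fits_frame_budget(run_one_frame, warmup=10, trials=100):
    """Time a per-frame callable and compare its average cost to the budget."""
    for _ in range(warmup):
        run_one_frame()
    start = time.perf_counter()
    for _ in range(trials):
        run_one_frame()
    avg_ms = (time.perf_counter() - start) * 1000.0 / trials
    print(f"average {avg_ms:.2f} ms/frame vs. budget {FRAME_BUDGET_MS:.1f} ms")
    return avg_ms <= FRAME_BUDGET_MS

# Stand-in workload for one frame of inference plus filtering.
fits_frame_budget(lambda: sum(range(10_000)))
```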
A series of experiments revealed that MobilePoser, on average, has a tracking error of 8 to 10 centimeters. This is somewhat more than alternative options, like the Microsoft Kinect, which has a tracking error of 4 to 5 centimeters. But of course that increase in error also comes with total portability and there is no new hardware to buy, so that may be an acceptable trade-off for many use cases.
Looking ahead, the researchers are hoping to address a few issues with their approach. One notable problem is that the positions of worn devices can shift somewhat over time, which causes drift in the measurements and leads to inaccuracies. They suggest that additional sensor data, like GPS, UWB, or visual odometry, could help to correct for this drift. In any case, using the devices that we already own to create a full-body tracking system is quite an appealing idea, so we hope to see MobilePoser improve in the future.
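The researchers only suggest GPS, UWB, or visual odometry as possible corrections. One minimal way such an occasional absolute fix is often blended with a drifting estimate is a complementary filter, sketched below with made-up drift rates and trust values purely for illustration.

```python
import numpy as np

def complementary_correct(drifting_position, absolute_fix, alpha=0.1):
    """Pull a drifting position estimate toward an occasional absolute fix.
    alpha controls how strongly each fix (e.g. from UWB or GPS) is trusted."""
    return (1.0 - alpha) * drifting_position + alpha * absolute_fix

# Toy example: the IMU-derived estimate drifts by 2 cm per step while the
# true position stays at the origin; sparse absolute fixes rein it in.
estimate = np.zeros(3)
for step in range(1, 101):
    estimate += np.array([0.02, 0.0, 0.0])     # accumulated drift
    if step % 10 == 0:                         # an absolute fix arrives
        estimate = complementary_correct(estimate, np.zeros(3), alpha=0.3)
print(np.linalg.norm(estimate))  # stays bounded instead of growing to 2 m
```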