Perils of presence: VR allows the most detailed, intimate digital surveillance yet

[As much potential as VR and other presence-evoking technologies have for good, their diffusion in society raises critically important ethical issues; this story is from The Intercept and includes a quote from our colleague Andrea Stevenson Won; the original version features a series of eerie animations. –Matthew]

[Image: From an animation by Scott Gelber for The Intercept]

THE DARK SIDE OF VR

Virtual Reality Allows the Most Detailed, Intimate Digital Surveillance Yet

December 23, 2016
Joshua Kopstein (additional research: Jeremiah Johnson)

“WHY DO I look like Justin Timberlake?”

Facebook CEO Mark Zuckerberg was on stage wearing a virtual reality headset, feigning surprise at an expressive cartoon simulacrum that seemed to perfectly follow his every gesture.

The audience laughed. Zuckerberg was in the middle of what he described as the first live demo inside VR, manipulating his digital avatar to show off the new social features of the Rift headset from Facebook subsidiary Oculus. The venue was an Oculus developer conference convened earlier this fall in San Jose. Moments later, Zuckerberg and two Oculus employees were transported to his glass-enclosed office at Facebook, and then to his infamously sequestered home in Palo Alto. Using the Rift and its newly revealed Touch hand controllers, their avatars gestured and emoted in real time, waving to Zuckerberg’s Puli sheepdog, dynamically changing facial expressions to match their owner’s voice, and taking photos with a virtual selfie stick — to post on Facebook, of course.

The demo encapsulated Facebook’s utopian vision for social VR, first hinted at two years ago when the company acquired Oculus and its crowd-funded Rift headset for $2 billion. And just as in 2014, Zuckerberg confidently declared that VR would be “the next major computing platform,” changing the way we connect, work, and socialize.

“Avatars are going to form the foundation of your identity in VR,” said Oculus platform product manager Lauren Vegter after the demo. “This is the very first time that technology has made this level of presence possible.”

But as the tech industry continues to build VR’s social future, the very systems that enable immersive experiences are already establishing new forms of shockingly intimate surveillance. Once they are in place, researchers warn, the psychological aspects of digital embodiment — combined with the troves of data that consumer VR products can freely mine from our bodies, like head movements and facial expressions — will give corporations and governments unprecedented insight and power over our emotions and physical behavior.

VIRTUAL REALITY AS a medium is still in its infancy, but the kinds of behaviors it captures have long been a holy grail for marketers and data-monetizing companies like Facebook. Using cookies, beacons, and other ubiquitous tracking code, online advertisers already record the habits of web surfers using a wide range of metrics, from what sites they visit to how long they spend scrolling, highlighting, or hovering over certain parts of a page. Data behemoths like Google also scan emails and private chats for any information that might help “personalize” a user’s web experience — most importantly, by targeting the user with ads.

But those metrics are primitive compared to the rich portraits of physical user behavior that can be constructed from data harvested in immersive environments, via surveillance sensors and techniques that have already been controversially deployed in the real world.

“The information that current marketers can use in order to generate targeted advertising is limited to the input devices that we use: keyboard, mouse, touch screen,” says Michael Madary, a researcher at Johannes Gutenberg University who co-authored the first VR code of ethics with Thomas Metzinger earlier this year. “VR analytics offers a way to capture much more information about the interests and habits of users, information that may reveal a great deal more about what is going on in [their] minds.”

The value of collecting physiological and behavioral data is all too obvious for Silicon Valley firms like Facebook, whose data scientists in 2012 conducted an infamous study titled “Experimental evidence of massive-scale emotional contagion through social networks,” in which they secretly modified users’ news feeds to include more positive or negative content, measurably shifting the emotional tone of those users’ own posts. As one chief data scientist at an unnamed Silicon Valley company told Harvard Business School professor Shoshana Zuboff: “The goal of everything we do is to change people’s actual behavior at scale. … We can capture their behaviors, identify good and bad behaviors, and develop ways to reward the good and punish the bad.”

HEAD MOVEMENTS ARE the most common metric used by the handful of companies that form the still-nascent VR analytics industry. Commercial analytics software like SceneExplorer, made by the Vancouver-based company CognitiveVR, records a head-mounted display’s gyroscopic sensor data to reconstruct “heat maps” of everywhere a user looks while in VR.
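
To make the idea concrete, here is a minimal sketch, in Python, of how head-orientation samples might be binned into such a heat map. This is not CognitiveVR’s actual code; the sensor data is simulated and the function name is invented for illustration.

```python
import numpy as np

def gaze_heat_map(yaw_deg, pitch_deg, bins=(36, 18)):
    """Bin (yaw, pitch) head-orientation samples into 10-degree cells
    covering the full range of view directions."""
    heat, _, _ = np.histogram2d(yaw_deg, pitch_deg, bins=bins,
                                range=[[-180, 180], [-90, 90]])
    return heat / heat.sum()  # normalize so cells read as dwell-time fractions

# Example with simulated tracking data: 60 seconds sampled at 90 Hz
rng = np.random.default_rng(0)
yaw = rng.normal(0, 30, size=5400)     # user mostly looks straight ahead...
pitch = rng.normal(-5, 10, size=5400)  # ...and slightly downward
heat = gaze_heat_map(yaw, pitch)
print(heat.max())  # the hottest cell is where the user dwelled longest
```

Even this toy version shows how little is needed: a stream of orientation angles, already produced by every consumer headset, is enough to build a persistent record of what held a user’s attention.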

VR products like the Oculus Rift are also well-positioned for the burgeoning field of emotion detection, particularly when paired with sensors that map real-life body movements into a virtual world. Yotta Technologies, a VR company based in Baton Rouge, Louisiana, claims its platform can detect a user’s emotional state using an array of headset-mounted sensors, reading microexpressions by tracking eye and muscle movements in the face. One benefit for end users is that such information can give their VR avatars facial expressions that mirror their own. The company’s founder told Fusion that its primary goal is to “unlock human emotion.” That goal is shared by Affectiva, an MIT spin-off that offers “emotion detection as a service,” allowing clients to mine webcam images and video feeds for emotional data revealing how people’s faces subtly react to certain cues.

Eventually, consumer VR systems will be able to capture a full range of human body motion, forming what Madary and Metzinger call a “kinematic fingerprint.” Much like the experimental gait recognition capabilities of some video camera-based monitoring systems, a kinematic fingerprint could be used to uniquely identify and analyze a person based on their body movements and posture, both inside and outside VR. Input devices like the Oculus Touch and Leap Motion show that hand tracking and gesture control are already becoming standard features in VR, allowing users to manipulate virtual objects and gesticulate to emphasize speech.
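
As a rough illustration of how a kinematic fingerprint might work, the sketch below reduces a stream of tracked positions to a few summary statistics and matches them against stored user profiles. The features here are hypothetical and far simpler than real gait- or motion-recognition models; the point is only that identification requires nothing more exotic than the positional data headsets already emit.

```python
import numpy as np

def motion_features(positions, dt=1.0 / 90):
    """positions: (T, 3) array of a tracked point (e.g. a headset) over time."""
    vel = np.diff(positions, axis=0) / dt           # frame-to-frame velocity
    speed = np.linalg.norm(vel, axis=1)
    return np.array([speed.mean(), speed.std(),     # how fast, how jerky
                     positions[:, 1].mean(),        # characteristic height
                     positions[:, 1].std()])        # vertical sway

def identify(sample, profiles):
    """Nearest-neighbor match of a motion sample against stored profiles,
    given as a dict mapping user name -> feature vector."""
    feats = motion_features(sample)
    return min(profiles, key=lambda user: np.linalg.norm(feats - profiles[user]))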

Advertisers are particularly excited about measuring and analyzing these movements. In a September 2016 industry report by the Interactive Advertising Bureau, the director of VR at one mobile ad company claims that the collection of this data is already “shedding light on user behavior and brand engagement for our advertisers,” and predicts that consumer VR will “bring unmatched value and measuring capabilities.” While VR adoption has been relatively slow due to prohibitive hardware costs and other factors, investment in the medium increased dramatically this year, suggesting ad companies expect to take advantage of those physical tracking capabilities in the near future.

FOR NOW, MUCH of the commercial third-party software that captures physical movements in VR is ostensibly designed for VR developers, who use the data to identify which parts of their worlds are most engaging and which parts need tweaking. This cuts down on development costs by letting designers see how users react to different elements in real time.

Similar tracking systems have been used to study the therapeutic potential of VR. A recent study by researchers at Cornell and Stanford universities found that head movements can be used to measure a person’s level of anxiety while inside a virtual classroom. In the past, researchers have also found that carefully crafted virtual environments can aid in the treatment of post-traumatic stress, depression, anxiety, and other conditions.
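
The study’s actual measures aren’t reproduced here, but a crude statistic along these lines conveys the general idea: the more a user’s head sweeps around a virtual room per unit time, the more restless, and perhaps anxious, the behavior. A minimal sketch, with an invented function name:

```python
import numpy as np

def scan_rate(yaw_deg, pitch_deg, duration_s):
    """Total angular head travel in degrees per second: a crude proxy for
    how restlessly a user scans a scene (ignores yaw wrap-around at
    +/-180 degrees for brevity)."""
    travel = np.abs(np.diff(yaw_deg)).sum() + np.abs(np.diff(pitch_deg)).sum()
    return travel / duration_s
```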

What’s changed is that these insights are no longer limited to a lab testing environment. With the proliferation of consumer VR products featuring a range of increasingly sophisticated sensors, a wealth of physical surveillance data can now be made available to marketers, private corporations, and perhaps inevitably, police and government agencies.

“I think it’s interesting to consider whether people will care more about having their physiological data tracked than they seem to care about having the rest of their lives online recorded,” says Andrea Stevenson Won, a PhD graduate of Stanford’s Virtual Human Interactions Lab and the lead author of the Cornell-Stanford study. “I’m pretty protective of most of my data, so I agree that I don’t want any more collected than is necessary.”

Early experiments suggest that using this data to manipulate users is well within the means of those who control virtual and augmented reality platforms. Researchers have demonstrated that people immersed in VR can be influenced in a number of ways — from causing them to make more environmentally conscious choices to affecting the results of tests for racial bias. But with no laws restricting what kinds of behavioral data consumer VR companies can collect and how it can be used, the door has been left open for more nefarious and profit-seeking applications.

For example, in a 2014 paper on the convergence of VR and online social networks like Facebook, researchers at Dublin City University propose that AI-controlled avatars could be used to “nudge” users into accepting certain ideas or views. “An avatar might respond with a smile if asked about one political or religious idea, and frown when discussing another,” the researchers write. “Artificial avatars would be all the more effective if they can access data about the user’s emotional responses via eye-tracking or emotion capture.”
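
To see how little machinery such a nudge requires, consider a hypothetical sketch; the topic lists, the engagement score, and the expression vocabulary are all invented for illustration.

```python
# Scripted agent that picks its avatar's expression based on the topic
# inferred from the user's speech -- the mechanism the DCU paper warns about.
APPROVED_TOPICS = {"policy_a", "candidate_a"}      # ideas to reinforce
DISCOURAGED_TOPICS = {"policy_b", "candidate_b"}   # ideas to discourage

def avatar_expression(detected_topic, engagement):
    """engagement: 0..1 score, e.g. derived from eye-tracking dwell time."""
    if detected_topic in APPROVED_TOPICS:
        return "smile" if engagement > 0.5 else "nod"
    if detected_topic in DISCOURAGED_TOPICS:
        return "frown"
    return "neutral"
```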

SO FAR, THERE haven’t been any formal legislative proposals seeking to limit these novel forms of surveillance. One strategy privacy advocates are eyeing is to create new legislation modeled after Illinois’s Biometric Information Privacy Act, the toughest of only two existing biometric privacy laws in the U.S. and the focus of an ongoing class-action lawsuit against Facebook, Google, Snapchat, and other companies that gather face recognition data. At issue is the use of automatic “tag suggestions” and other features that run afoul of the Illinois law by extracting biometric information from users’ photos without their explicit consent.

Fred Jennings, an attorney with the digital rights-focused law firm Tor Ekeland, PC, says that if the plaintiffs in the Illinois biometric privacy case prevail, it could establish a foothold for future efforts to limit the intimate data collected by VR and AR platforms. But unlike traditional biometric identifiers such as fingerprints and DNA, data that records a wide range of voluntary and involuntary physical movements has no settled legal definition, making it difficult to legislate what can and can’t be done with it once it’s collected.

“The problem is it falls into this gray area in between medical data, which is pretty well litigated and protected, and communications data,” says Jennings.

Transparency would also need to be a crucial part of any regulation, he says, because in many cases it’s not clear to what extent behavioral analytics platforms have already been integrated into consumer VR products and apps. Facebook, CognitiveVR, and Yotta Technologies did not respond to inquiries about how VR analytics platforms are currently being deployed and how the data they collect might be used.

Oculus’s privacy policy specifically states that it collects “physical movements and dimensions” from users in order to “customize your experiences based on your online activities” and “market to you.” The company also claims the right to share that information with third parties including Facebook, as well as access and preserve it in order to “detect, prevent and address fraud or other illegal activity,” among other uses. Last April, the permissive language led Sen. Al Franken to send an open letter to Oculus expressing concerns about the privacy of Oculus Rift users.

An Oculus representative did not directly respond to several questions about the company’s collection and use of physical movement and behavioral data, instead referring to its May 13 response to Sen. Franken’s letter. In that response, Oculus general counsel Jordan McCollum wrote that the company needs to collect “physical movements and dimensions” and share that data with its developers “so they can deliver experiences that better respond to peoples’ physical movements, which is a critical feature of a good VR experience.” Asked whether Oculus would use the data to manipulate users’ behavior or emotions, as Facebook did in its emotional contagion experiment, the representative responded “no.”

“Once upon a time, advertising was a very simple thing to detect, but now that’s no longer the case,” says Jennings. “The Facebook biometric information case is a great example of just how subtle this use of user information has become, and how difficult it can be to detect potentially impermissible uses of that information.”

At the Oculus event in San Jose, Oculus chief scientist Michael Abrash admitted that some of these tracking technologies are not yet reliable enough to be fully integrated into consumer VR hardware. Retina tracking is especially difficult, he said, and “would require inventing a whole new type of eye-tracking technology” to be usable for real-time capture and feedback.

“The technology is changing very quickly. But I do not think that there is any technological barrier to the kinds of manipulation that raise concerns,” says Madary. “Right now I am not aware of virtual environments that change based on the data collected about each particular user. But I don’t see any reason why such a personalized dynamic virtual environment could not be developed and sold (or given) to consumers.”
