Google, AP and Tribeca Film Institute fund ambitious immersive journalism project

[From Fast Company’s Co.Create, where the story includes more images and a video]

[Image: Original image from the Use of Force Protocol events]

[Image: Death At The Border: Security camera footage was used to plop bystanders at the scene of Anastacio Hernández-Rojas’ death in a wraparound, fully immersive digital world.]

Put On A Helmet, And You’re In The Story: Why Virtual Reality Journalism Is The Future

Google, the Associated Press, and the Tribeca Film Institute are funding one of the most ambitious virtual reality projects yet: a 3-D, fully immersive re-creation of the death of a migrant at the U.S.-Mexico border.

By: Neal Ungerleider

Nonny de la Peña isn’t your typical PhD candidate. The former Newsweek reporter turned University of Southern California doctoral candidate specializes in re-creating real-life news events as fully immersive virtual reality experiences. Inside de la Peña’s lab, a handful of colleagues use the Unity gaming engine and other tools to engineer 3-D reenactments of violent crimes and dramatic incidents that integrate original audio and video sources. Experiencing a movie on her customized headset is like stepping onto Star Trek’s holodeck. It’s also a storytelling technology that Google, the Associated Press, and the Tribeca Film Institute are very interested in.

Co.Create has written about de la Peña’s work before. Her 2012 immersive film Hunger in L.A., which appeared as part of the New Frontiers program at Sundance, re-creates a real-life incident that happened at a Los Angeles food bank. A man waiting in line went into a diabetic coma, and another person in line turned on a recorder. An audio recording of the incident, integrated into the virtual reality segment, captures EMTs making fun of the victim and food bank volunteers clumsily attempting to manage the growing crowd.

The film, watched on a heavy virtual reality headset that takes over most of the viewer’s field of vision, leaves only a small sliver of real-life peripheral vision. Otherwise, it overwhelms the senses. While real audio of the event plays, a virtual world envelops the viewer. They can walk around the whole block, tilt their head toward the sky, or stare down at the pavement. When viewing it, you aren’t in a dark room at USC. You’re in line at a food bank. Google, Tribeca, and the Associated Press are all generously funding an even more ambitious use of de la Peña’s talents.

The three organizations are funding her new project called Use of Force Protocol. The virtual reality experience is a five-minute-long re-creation of the death of Anastacio Hernández-Rojas, a Mexican migrant to the United States. When Hernández-Rojas resisted deportation at the U.S.-Mexico border, he was severely beaten and tasered by Border Patrol and Customs and Border Protection officers. He died of his injuries; multiple cameras caught Hernández-Rojas’ death. Use of Force is a well-funded, fully immersive re-creation of the deportation and death that stations viewers among a crowd of bystanders who were trying to prevent the assault in real life.

This is a new way of telling stories. “I work on immersive journalism in 3-D virtual worlds,” de la Peña says. “I don’t give you a body or a physical presence in the virtual world because viewers don’t have any agency to change history.” In many cases, audio captured at the scene of the events will come out of the mouths of the actual people who said it; de la Peña contacted witnesses in order to create accurate digital avatars for them.

One funder, Ingrid Kopp of the Tribeca Film Institute, told Co.Create that Use of Force Protocol “pushes the boundary of story and technology.” On June 27, Tribeca gave de la Peña a $50,000 grant to develop the VR experience; an additional $20,000 in funding came from the AP-Google Journalism and Technology Scholarship Program, which funds “innovative projects that further the ideals of digital journalism.”

Although Use of Force doesn’t allow viewers to change the circumstances of Hernández-Rojas’ death, it does place them directly at the scene. The new film wasn’t ready for testing when I visited, but Hunger in L.A. was a good indication of where things are going. With that film’s bulky, heavy helmet strapped on, I was inside a fully immersive virtual world. With de la Peña playing minder and holding a tether that prevented me from bumping into walls, I somehow ended up inside the news story.

It’s easy to see why Google, the AP, and Tribeca are interested in this technology: it holds obvious potential for journalism, as well as for fictional storytelling. It also raises dicey issues of objectivity and subjectivity: creating a virtual world for participants to immerse themselves in opens up possibilities for bias and story shading that standard print or video journalism doesn’t. Nonetheless, it’s a sneak peek at a future of journalism that’s just around the corner. When Google Glass develops into successor products and Oculus Rift-style virtual reality gaming becomes commonplace, virtual reality walkthroughs will be the future of journalism. Even as a preview, Use of Force is genuinely exciting.

