Study shows how to make audio experience in virtual reality ‘authentic’

[A project and publication from University of York researchers focuses on how to design audio experiences in virtual reality to create more effective presence experiences. See the original version of the story for a 1:46 minute gameplay demonstration (also available on YouTube) and find more videos in the article’s supplementary material. –Matthew]

[Image: Figure 3 from Constantin Popp et al., Creating Audio Object-Focused Acoustic Environments for Room-Scale Virtual Reality, Applied Sciences (2022). DOI: 10.3390/app12147306. Screenshot showing an area in Planet Xerilia. The collision-based system is implemented in the “water” droplets, here visible as randomly rotated white small squares, which fall from the area’s ceiling. Once they collide with the floor, they disappear and make a “splash” sound.]


August 3, 2022

Researchers at the University of York have shown that creating sound in virtual reality that mimics people’s everyday interactions with the ‘real’ audio-visual world has benefits over a movie-style sound experience.

Experts working on the University of York’s XR Stories project have proposed a new sound strategy for developing virtual reality (VR) environments, based on how people perceive images and sounds in the real world, ultimately improving the user experience and reducing the risk of VR motion sickness.

Virtual reality applications – experienced in the home environment – use a variety of techniques to convey sound; some use a ‘movie’ approach, where the sound is outside of the interactions in the VR world, others use a single object focus for music, such as a radio, and some match sound with environmental interactions, such as the sound of an object falling on the floor.

Most often, however, VR uses a mixture of audio methods – some sounds originating from an object within the world and other sounds appearing to be imagined or outside of the virtual world.

This approach comes with some issues, however: it can contribute to a feeling of being disconnected from the virtual world, or in some cases induce motion sickness, whereby the cues the user’s eyes and ears send to the brain don’t quite match up.

If sound does not mirror the everyday audio experience – a bird’s song should get louder as the VR user moves closer to it, for example – or if music is overlaid with no apparent source or reason, the result can be an inauthentic, confusing experience.

The University of York researchers proposed a new sound design strategy based on objects that would naturally produce sound in the ‘real world’: the entire audio environment in VR is built exclusively around objects that emit sound from a specific position in space, mirroring how people experience sound in real life.

Responsive

Constantin Popp, Research Associate on the XR Stories project at the University of York’s AudioLab, said: “The difference from previous methods is that we aim to apply this thinking to all sounds and music that are part of an experience, not just some parts.

“This thinking allows us to make each sound-producing object interactive and responsive to the user, which improves the user’s experience. In this way we can also re-use data that already exists in the VR game, such as an object’s velocity, and apply it to sound.

“For example, when a user drops an object, the game would play a matching sound indicative of how fast and where the object fell. This strategy improves believability and narrative depth.”
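The velocity-to-sound mapping described above can be sketched in a few lines. This is a minimal, hypothetical illustration rather than the authors’ implementation: the function name, the linear gain mapping, and the pitch range are all assumptions chosen for clarity, not details from the paper.

```python
def impact_sound_params(speed_m_s, max_speed=10.0):
    """Map a collision speed (m/s) to illustrative audio parameters.

    Returns (gain, pitch): gain scales linearly with speed and is
    clamped to [0, 1]; pitch is nudged upward so faster impacts
    sound slightly brighter. Both mappings are assumptions for the
    sake of the sketch, not values from the study.
    """
    # Normalize speed into [0, 1], clamping anything above max_speed.
    t = max(0.0, min(speed_m_s / max_speed, 1.0))
    gain = t                    # louder for faster impacts
    pitch = 0.9 + 0.3 * t       # playback rate between 0.9x and 1.2x
    return gain, pitch


# A game engine would call something like this from its collision
# callback, passing the impact point to a positional audio system:
gain, pitch = impact_sound_params(speed_m_s=4.0)
```

In an engine, the returned parameters would drive a positional sound source placed at the collision point, so the splash in the droplet example is heard at the right place, at a loudness matching how hard the drop hit.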

Everyday world

The researchers added, however, that this methodology requires more processing power in the VR headset than many current approaches, and would also lengthen the development phase and increase the overall cost, so more work may be needed to make the process quicker and less expensive.

Professor Damian Murphy, from the University of York and Director of XR Stories, said: “We believe that in applying this method we can make VR experiences feel more ‘real’, as it increases the responsiveness of the environment to be more in line with that of our everyday world.

“Better audio-visual design also reduces the risk of the user feeling ‘spaced out’ or suffering the effects of motion sickness – a common issue for some people in VR – by allowing the eyes and ears to sync up with the images that the brain is receiving.

“This approach to audio in VR can provide a more unified, natural, and authentic experience.”

The research was supported by the Arts and Humanities Research Council (AHRC) XR Stories project as part of the Creative Industries Cluster Programme, and is published in the journal Applied Sciences [Constantin Popp et al., Creating Audio Object-Focused Acoustic Environments for Room-Scale Virtual Reality, Applied Sciences (2022). DOI: 10.3390/app12147306].
