When it comes to VR game development, we usually think of the visual element first. And it’s obviously super important! But sound is just as important to building a convincing virtual world. In fact, the wrong audio can completely ruin even the most immersive visuals.

Sound plays such an important part in our everyday experience that we just tune it out – the sigh of air vents, the whine of computer monitors, the clacking of keyboards (can you tell I'm in an office right now?).

How immersive could a VR experience be if there were no ambient sound effects to build up our world? It would feel hollow and empty, like a sound stage.

And sound is even more important for gameplay elements – like the accurate placement of footsteps to signal advancing danger, a character's voice coming from their mouth, or the difference between a muzzle blast from a gun's barrel and the tinkling of spent brass on the ground. These types of placements are absolutely crucial to proper VR immersiveness!

But like everything else in VR development, sound for VR introduces its own unique set of challenges. So, what makes VR 3D audio different from regular 3D audio?


Head-Related Transfer Function – HRTF

The head-related transfer function is the phenomenon that makes sound for VR tricky. Your brain interprets the minuscule differences in timing and volume between your two ears – along with the way your head and outer ears filter incoming sound – to clue you in on the direction of a source. In other words, you locate a sound in the world based on the subtle differences between what each ear hears.

Regular 3D games imitate these audio cues by using surround sound to pipe sounds to the left or right side depending on the player's location in-game. However, surround sound only works when the player is oriented in the same direction as the speakers. When a player moves their head or turns their body, as VR necessitates, the outputs need to be rotated against the player's rotation so that sounds appear to remain in the correct spot in 3D space.

And because the player can move their head, they can perceive audio much more accurately – in fact, the same way we perceive it in the real world. This is a problem, because it means we need to place our audio in the world much more accurately too. That's a contrast to non-VR projects, where we can get away with placing audio far less precisely.

Basically – the very first thing a player often does in VR is get really close to objects, so it’ll be easy to hear when an NPC’s footsteps are coming from their neck, or their voice is emanating from below their feet!


The solution

Sound placement is vital to keeping audio in VR feeling natural. Luckily, Unreal Engine allows you to attach components to skeletal bones. Use this to your advantage! Let's break down how to set that process up, using a character's voice track attached to the jaw bone as an example.


Step one, create an audio cue

Create a Sound Cue and add an Attenuation node. The first node is a simple sound wave, dragged in from the Content Browser. The second is the Attenuation node, which decreases a sound's volume the farther the listener is from it. The third node is the output.

Step two, the audio cue’s Attenuation node

With the Attenuation node selected, you can either supply a set of attenuation settings from the Content Browser or use the defaults. The override switch at the bottom allows for custom settings. You'll need to adjust these depending on your game's world scale – testing is important here.

Step three, add notify track

To add your sound to a particular animation, right-click on the Notify Track and select Add Notify > Play Sound.

Once you've added the Play Sound notify to the animation, you can inspect its details by clicking on it. The Details pane allows you to select either the raw wav file or the Cue asset you created (in most situations you'll want the Cue, so that attenuation, randomization, etc. can be applied to the output). Importantly – and specific to VR – sounds should be attached to the things emitting them.

Check the Follow box and enter the name of a bone from the Skeleton Tree into the Attach Name field. That's how you attach specific sounds to specific bones: a character's shuffling feet should be attached to each individual foot, voices should come from the head, and gunshots should come from the barrel of the gun, not the stock.

VR audio made easy!

That’s it! UE4 makes the process pretty easy. Syncing your sound effects with your animations will really improve the immersiveness of your VR space!