Flood VR: Designing an Underwater Soundscape

Recently I was asked to work on Flood, a virtual reality performance project, in development by Megaverse.

In this early prototype, you explore a dark city environment in VR, where you'll witness a theatrical performance by real actors (captured using motion cameras and recreated in the Unity Scene) and become submerged by water as the world around you floods. It was my job to add audio and music to the project and to create the sound of being underwater.

Here’s what happened…

Meetup in London

Before I started working on the project, we met up at Imagination London for a day-long workshop to explore ideas. This was a great way to quickly get a feel for the mood of the project and what would happen in the game. This was particularly important for Flood which, as a cinematic experience, required all kinds of audio for scripted moments throughout, not just static environmental audio in the scene. Getting a good idea of what was going to happen, and what it should sound like when it did, was vital from the outset.

Imagination London

I’ve had meetings in worse places, that’s for sure.

Building a Virtual Sound Stage

This project was complex, in that it required many different types of audio.

Some sounds would need to be Ambisonic, working as a backdrop to the entire scene, while other 3D sounds would need to be Spatialized to give a realistic sense of space.

Some sounds would need reverb, while some would need to sound as if they were coming from inside buildings.

Other sounds, like rain, thunder and water, would need to sound like they were coming from everywhere.

To get everything working, and working together, I needed to be able to experiment and troubleshoot, so I built a simple sound stage in Unity, loosely resembling the in-game environment, and got to work.

Virtual Sound Stage Scene in Unity

I needed a place to experiment first, so this was my virtual sound stage.

Building up the Soundscape

Underneath everything, I wanted to have one ambient layer of audio acting as a backdrop to all of the other sounds. This being a VR project, the best option for this was an Ambisonic Audio loop.

Ambisonic Audio isn’t a new technology, it’s actually been around since the 70’s. However, with the emergence of Virtual Reality and 360 video, Ambisonic Audio has become and extremely useful tool, offering realistic surround sound through binaural headphones.

As well as an Ambisonic atmosphere, we knew from the outset that we wanted to have Spatialized sounds in the scene for a high degree of realism and sense of space.

Unlike standard 3D sound, which uses attenuation, panning and doppler to create the illusion of 3D space, Spatial Audio goes a step further, using advanced technologies (such as occlusion, early and late reflections and head related transfer functions) to more accurately model sound as we hear it.

We also created natural reverb in a virtual 3D space, with defined walls and materials, using the Audio Room feature of Google's Resonance Audio, which simulates the acoustic properties of a 3D area based on reflectivity, size and even different types of material.

This created an accurately modelled audio zone where sounds, such as a dog barking, would echo realistically from the distance.
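As a rough sketch of how such a room might be configured from script: the component and field names below follow the Resonance Audio Unity SDK (check them against your installed version), while the dimensions and materials are purely illustrative.

```csharp
using UnityEngine;

// Sketch: setting up a Resonance Audio Room in code. The same fields
// can also be set on the component in the Inspector.
public class CourtyardAcoustics : MonoBehaviour
{
    void Start()
    {
        var room = gameObject.AddComponent<ResonanceAudioRoom>();
        room.size = new Vector3(20f, 10f, 20f); // rough bounds of the space
        room.reflectivity = 1.0f;
        room.floor = ResonanceAudioRoomManager.SurfaceMaterial.ConcreteBlockPainted;
        room.leftWall = ResonanceAudioRoomManager.SurfaceMaterial.BrickBare;
        room.rightWall = ResonanceAudioRoomManager.SurfaceMaterial.BrickBare;
        // Sounds inside these bounds (a distant dog bark, say) pick up
        // early reflections and reverb matched to the room's materials.
    }
}
```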

For more information on how to use Ambisonic Audio and Spatial Audio in Unity, try my Getting Started Guide.

Creating Underwater Ambience

One of the standout moments of the Flood experience is being struck by a massive wave, which submerges you, and the world around you, in water.

As it does, the chaos of the rain and thunder fall away, everything is still and, all of a sudden, you’re all alone underwater.

It was my job to represent this with a realistic underwater effect.

Flood VR Game Concept Art

During the Flood experience, you find yourself entirely submerged by water.

To achieve this, I needed to combine a few different elements.

In its simplest form, I used a collider on the camera to detect when the player's head was underwater. I applied a low pass filter to attenuate frequencies above 1,000 Hz and toggled the effect with an Audio Mixer Snapshot, masking the transition with a splash sound.

The filter muffles high frequencies, much like real water does. However, a filter alone isn't enough to create the underwater effect.

It’s surprisingly noisy underwater, so I added a deep, rumbling water loop, to be played whenever the player is submerged. Initially played loud, this fades away as the scene becomes calm and quiet, giving the player some time to move around and experience the underwater scene.

Which leads me to the next challenge… movement.

We knew early on that we’d need some sort of system for triggering smooth, fluid sounds in response to the player moving around.

To create this, I carried out some experiments which involved adding audio to an underwater cube that you could then drag around the screen.

In this basic example, I played a loop of water movement noise that is silent by default.

When the cube is moved, the volume of the audio loop is pushed up, relative to the speed of the object. Then when it’s slowing down or is still, the volume of the loop is faded out, tailing off the water movement effect.
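That experiment might be sketched in Unity like this; the speed threshold and fade rate are placeholder tuning values, not the project's actual numbers.

```csharp
using UnityEngine;

// Sketch: scale a looping movement sound's volume with object speed.
// The loop runs silently until movement pushes the volume up, then
// tails off as the object slows or stops.
public class MovementNoise : MonoBehaviour
{
    public AudioSource movementLoop;      // looping water-movement noise
    public float speedForFullVolume = 2f; // metres/sec mapping to volume 1
    public float fadeOutSpeed = 1.5f;     // volume units per second

    Vector3 lastPosition;

    void Start()
    {
        movementLoop.volume = 0f;
        movementLoop.Play(); // silent by default
        lastPosition = transform.position;
    }

    void Update()
    {
        float speed = (transform.position - lastPosition).magnitude / Time.deltaTime;
        lastPosition = transform.position;

        float target = Mathf.Clamp01(speed / speedForFullVolume);
        if (target > movementLoop.volume)
            movementLoop.volume = target; // push up immediately with movement
        else
            movementLoop.volume = Mathf.MoveTowards(
                movementLoop.volume, target, fadeOutSpeed * Time.deltaTime); // tail off
    }
}
```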

The final system worked in a similar way, except that I needed to account for a different type of movement, one that’s particularly important to VR games.

Head rotation…

In many VR games, particularly this cinematic experience, the player doesn’t actually move around much.

In fact, ever since the emergence of VR there have been all kinds of attempts at 'solving the movement problem', ranging from the highly immersive, in the form of room-scale VR experiences, to software solutions, such as teleportation, to hardware solutions, such as omnidirectional treadmills.

While I believe there's merit in working out elegant methods of moving around in a virtual space, I also think that the best VR games are those made for VR, that play to its strengths, as opposed to emulating games of the past that rely more heavily on movement.

In many VR games, the player is often likely to do much of their exploring simply by looking around and, in Flood, this is also the case.

So to allow for this, I extended the system to additionally check, roughly, how fast the player was moving their head and increased the volume of the underwater loop as they looked around the scene.

Finally, because I had no idea exactly how sensitive it would be (as I was using a mouse to develop and test with), I added a modifier. That way it could be fine tuned when hooked up to a real VR headset later on.
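Extending the same idea to head rotation might look like the following. The mapping values are placeholders, and the sensitivity field stands in for the modifier mentioned above, so the response can be adjusted once a real headset is attached.

```csharp
using UnityEngine;

// Sketch: drive the underwater loop's volume from how fast the
// player is turning their head (attach to the camera/head transform).
public class HeadMovementNoise : MonoBehaviour
{
    public AudioSource underwaterLoop;
    [Range(0f, 2f)] public float sensitivity = 1f; // tuned later on hardware
    public float degreesForFullVolume = 180f;      // deg/sec mapping to volume 1
    public float fadeOutSpeed = 1.5f;

    Quaternion lastRotation;

    void Start() { lastRotation = transform.rotation; }

    void Update()
    {
        // Roughly how fast the head is turning, in degrees per second.
        float angularSpeed =
            Quaternion.Angle(lastRotation, transform.rotation) / Time.deltaTime;
        lastRotation = transform.rotation;

        float target = Mathf.Clamp01(sensitivity * angularSpeed / degreesForFullVolume);
        if (target > underwaterLoop.volume)
            underwaterLoop.volume = target;
        else
            underwaterLoop.volume = Mathf.MoveTowards(
                underwaterLoop.volume, target, fadeOutSpeed * Time.deltaTime);
    }
}
```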

Music, Sound Effects & Bringing the Pieces Together

With the core functionality in place, I finished creating all of the actual audio content required for the scene such as sound effects for lights, TV sets, objects such as swings, doors and car alarms, dogs barking in the distance and, of course, all of the different ways water can make noise.

Surprisingly, the actual audio creation took a back seat to ensuring that the audio functionality worked as planned.

I wrote music for the scene, including a short, lonely piece, that’s triggered when the player first finds themselves alone under the water and music for the rain dance, during which real actors recreate a performance in the scene.


Rain Dance

My aim with the music was to create as natural a piece as I could. I wanted the music to sound rich and full but to also be very isolating, so I used relatively few instruments, played most of the parts in directly, without a click track, and didn’t correct them afterwards.

This is not how I usually write at all. I usually prefer to work out ideas on the keyboard and then meticulously correct and control the MIDI programming afterwards. However, working this way created a piece of music that was variable in speed and had its own natural rhythm, which was ideal for setting the tone of this project.

Finally, with all of the content and systems finished and working, I added a script to all of the Audio Objects that included simple audio helper functions to play, stop and fade audio in Unity, in an effort to make it as easy as possible to drop sounds in the Scene and trigger them throughout the course of the experience.
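A helper script along those lines might look something like this; it's a sketch of the idea, not the actual script from the project.

```csharp
using System.Collections;
using UnityEngine;

// Sketch: simple per-object audio helpers, so any sound in the Scene
// can be played, stopped or faded with a single call.
public class AudioHelper : MonoBehaviour
{
    public AudioSource source;

    public void PlaySound() => source.Play();
    public void StopSound() => source.Stop();

    public void FadeTo(float targetVolume, float duration)
    {
        StopAllCoroutines(); // cancel any fade already in progress
        StartCoroutine(Fade(targetVolume, duration));
    }

    IEnumerator Fade(float target, float duration)
    {
        float start = source.volume;
        for (float t = 0f; t < duration; t += Time.deltaTime)
        {
            source.volume = Mathf.Lerp(start, target, t / duration);
            yield return null;
        }
        source.volume = target;
        if (Mathf.Approximately(target, 0f)) source.Stop(); // silent = stopped
    }
}
```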

Working on Flood was a great opportunity to familiarise myself with technologies that I’d never used before such as Ambisonic and Spatial Audio.

For more information about the Flood project, visit the Megaverse website.

by John Leonard French

Freelance composer and audio designer for games.

Concept art images © Megaverse
