With several years of intensive development behind it and more than a year out in the wild, in the players' hands, EVE: Valkyrie has gone through some pretty dramatic evolutionary changes since its first experimental steps into the world of VR.
Any player who has spent even the briefest time in a virtual environment, and specifically in EVE: Valkyrie’s universe, will already know that sound is every bit as important as 3D stereoscopic visuals and motion tracking when it comes to creating a convincingly immersive experience.
We wondered how Valkyrie’s aural experience has developed over time since the project first left the drawing board. So, we spoke to CCP Newcastle Audio Lead, Jonathan McCavish AKA CCP Noise (pictured above, he's the guy on the left), to take us right back to the basics and give us the whole story on the game’s soundscape.
Hi Jonathan. Let’s start at the very beginning. We want to take you right back to day one on EVE: Valkyrie. Was sound part of the mix right from the get-go, or did it grow out of the early visuals?
Day one on Valkyrie began in the studio in Reykjavik, where a small team of individuals decided to create a tech demo of a VR space dogfighting game using Unity. The demo had a full complement of SFX which were created by Bjorn Jacobson, who at that time was an in-house sound designer at CCP in Reykjavik.
Following this, the game was ported to the Unreal game engine, and the audio was worked on by Alice Blunt (then CCP’s Audio Lead in Newcastle) and Ash Read (current sound designer at CCP Newcastle). When I joined the team in October 2014, we continued to develop most of the in-game audio further. At that point, the game was fully functional in most crucial areas, so we had a stable platform on which to experiment and fine-tune the content.
Is there any external inspiration for Valkyrie’s sound? Are there cinematic, or other sci-fi inspirations for what we hear in-game?
We are all huge sci-fi fans, of course, and so no doubt there are lots of influences that sneak in subconsciously. However, the goal with Valkyrie has always been to create the feeling of a dystopian world where the technology is hacked together.
The inspirations for the sound have frequently come from our use of synthesizers. When we first sat down to create a lot of the content, we had a modular synth in the studio, which we used to create sound resources which we could further develop digitally using plugins. This is something we continue to do. Using hardware synths, as well as software, is a very satisfying process and enables us to create bespoke content for the game without always going to sound libraries.
At the start of the project, were you already familiar with creating 3D sound in gaming generally and VR specifically?
As far as VR goes, not at all. We have had to find our way by testing the different 3D binaural audio plugins from Oculus, Sony and third parties on different SFX within our game. A lot of attention has been paid to how sound responds when you focus on it visually while wearing a VR headset. Some experiments have worked out well for us, others haven’t and have been dropped, so it’s been a learning process.
As far as standard 3D in-game audio design goes, yes, I’ve been working in that field since 2000, so I’m fairly well versed in that side of things.
What unique challenges did immersive VR sound throw up at those early stages?
VR poses new challenges because you’re trying to immerse the player in the game to a greater degree. The ‘listener’ – in other words, the point at which all the audio from the surrounding game is mixed down to – is located right at your head, and it moves with you when you move and turn your head. You are experiencing the audio in first person. This is very different to a standard 3D game, where the ‘listener’ will be attached to your character or vehicle, and will only move when you use the control pad to move them.
Experimenting with the binaural technology available took a little time, and the frame rate required for a VR game is much higher than a normal 3D game, so we always have to make sure our audio is even more efficient to save on resources. This mainly impacts on the number of audio channels we can use and the number of real-time digital effects we can route our audio through.
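For readers curious what the head-tracked listener Jonathan describes looks like in practice, here is a minimal sketch, assuming a simplified 2-D world with only head yaw tracked. The function names (`listener_relative`, `stereo_gains`) and the crude equal-power panner are purely illustrative and are not taken from Valkyrie's actual codebase or any engine API.

```python
import math

def listener_relative(source_pos, head_pos, head_yaw_deg):
    """Transform a world-space sound source into the listener's frame.

    In VR the 'listener' sits at the player's head, so head orientation
    (here just yaw) feeds into the transform every frame. Returns
    (azimuth_deg, distance): azimuth is the source's angle relative to
    the facing direction, 0 = straight ahead, positive = to the left.
    """
    dx = source_pos[0] - head_pos[0]
    dy = source_pos[1] - head_pos[1]
    distance = math.hypot(dx, dy)
    world_angle = math.degrees(math.atan2(dy, dx))
    # Wrap into [-180, 180) relative to where the head is pointing.
    azimuth = (world_angle - head_yaw_deg + 180.0) % 360.0 - 180.0
    return azimuth, distance

def stereo_gains(azimuth_deg, distance, ref_dist=1.0):
    """Crude equal-power pan plus inverse-distance attenuation."""
    atten = ref_dist / max(distance, ref_dist)
    pan = max(-90.0, min(90.0, azimuth_deg)) / 90.0  # clamp to [-1, 1]
    theta = (1.0 - pan) * math.pi / 4.0              # left pan -> more left gain
    return atten * math.cos(theta), atten * math.sin(theta)
```

The key difference from a standard third-person game is that `head_yaw_deg` comes from the headset's tracking data every frame, not from the control pad: a source dead ahead swings to full azimuth the instant the player turns their head. A real binaural renderer replaces the equal-power panner with HRTF filtering, which is what the plugins from Oculus and Sony provide.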
What equipment do you use to create sound for a Virtual Reality game?
Our favorite Digital Audio Workstation is Reaper because it’s so flexible and processor efficient. We use a variety of software synthesizers and, as mentioned, we also use hardware synthesizers as much as possible. Sci-fi sounds are great fun to work on, and making our own content allows us more creativity. We also record some of our own sounds in the real world and have recently started using contact microphones to record metallic resonances and so on.
We do use SFX from libraries too, but these clips always undergo a thorough sound design process to result in an original piece of content before they make it into the game.
We’ve come a long way in the past year and Valkyrie is constantly changing. How have you continued to improve and tweak the sound since the game launched over a year ago?
Regular updates for the game involve assets like new maps and new game modes. These require new SFX, voice over and often bespoke music (some of which we outsource to our composer Rich McCoull of East Wing studios). When we’re not creating audio for new content, we’re fixing issues and improving the way the game sounds generally, adding detail and new parameters to control how the audio reacts.
How have the new environments like Gateway, Solitude and Wormholes in recent major updates challenged you, and what solutions did you come up with?
Gateway and Solitude are both interesting maps, and the main challenge we faced was in adding new ambience. Also, some areas now use real-time reverb, such as the central warp tunnel in Gateway and the subterranean areas in Solitude. We had to make some efficiency savings and work closely with our game engine code team to get this working smoothly.
As mentioned, VR requires high frame rates, so sometimes getting the efficiency required in our audio can be tight. Wormholes is an area of the game where we don’t always know what will come next, but luckily our audio is quite flexible and only a few bespoke SFX are required for each new wormhole.
Aside from new sounds for new maps, ships and weapons, what developments in Valkyrie’s sound can we expect in future?
I would say one of the main areas we’re focusing on at the moment is further personalizing our ships. We’re planning to reduce the number of ships in-game but create more individuality among those that remain, which extends to the way each ship’s engines sound and the SFX for its weapons and abilities.
As far as new VR audio technology goes, we’re always on the lookout, but there’s nothing we’re actively investigating right now. We’ll be attending Develop this year, and perhaps that will shed light on some emerging technologies.
In other areas, I think we are seeing a range of high end headphones coming out for VR. For the consumer, these kinds of products would represent a greatly improved audio experience when compared to the default kit. As it’s still an emerging technology, I’m sure we’ll see plenty of new developments for VR audio soon, both for games and for other products.
A big thanks to Jonathan for that insightful look into the often-mysterious world of in-game audio effects.
Be sure to keep your browser pointing this way as we regularly check in with our dev team to shed light on the fascinating processes that go into creating EVE: Valkyrie.
Our next foray into the developmental unknown will be a chat with Andrew Robinson on his multi-faceted role and the dark art that is narrative design, so keep an eye out for that in the near future.
If you want to know more about EVE: Valkyrie’s development, a good place to start would be our previous dev focus on Wormholes.
Until next time, fly safe!