
The first official tech demo for Unreal Engine 5 has already racked up 13.25 million views since debuting on YouTube in May. The impressive clip has excited video game fans around the world, offering a first look at the technological leaps and bounds they can expect from the next generation of titles powered by Epic Games’ popular game engine.

One of the new features showcased in the widely viewed tech demo, the ability to natively render ambisonic soundfields in-engine, was developed by none other than Epic Games audio programmer Max Hayes, a 2019 DigiPen graduate with a BS in Computer Science and Digital Audio. The feature was released to the public in beta just a week prior to the tech demo as part of the Unreal Engine 4.25 update.

So, what are ambisonic soundfields, and why does the ability to render them in Unreal matter? You can think of ambisonic soundfields as a more dynamic, flexible way to create the sensation of surround sound — one that doesn’t rely on the user owning any specific speaker arrangement to do so. “Ambisonics is a way to represent 3D audio which is decoupled from the end user’s speaker setup. They could be using headphones, or a 5.1 surround system — ambisonics doesn’t really care,” Hayes explained in a “State of Audio in 4.25” video released on the Unreal YouTube channel in April. “It does this by representing audio as a soundfield instead of storing speaker channel information. So that means that ambisonics is trying to take a sphere of space around the user and represent all the sound wave and air pressure activity that would be going on in that sphere of audio.”
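To make that concrete, here is a minimal sketch of first-order ambisonic (“B-format”) encoding, the simplest version of the representation Hayes describes. The struct, the function name, and the traditional FuMa channel weighting are illustrative conventions from the ambisonics literature, not code from Unreal Engine itself.

```cpp
#include <cmath>

// One sample of a first-order ambisonic soundfield. Note that nothing
// here refers to speakers: the four channels describe the sound pressure
// (W) and its front-back, left-right, and up-down gradients at a point.
struct BFormatSample
{
    float W, X, Y, Z;
};

// Encode a mono sample arriving from a given direction into the soundfield.
// Azimuth is measured counterclockwise from straight ahead, elevation up
// from the horizontal plane, both in radians.
BFormatSample EncodeFirstOrder(float monoSample, float azimuth, float elevation)
{
    BFormatSample out;
    out.W = monoSample * 0.70710678f; // 1/sqrt(2), the traditional FuMa weight
    out.X = monoSample * std::cos(azimuth) * std::cos(elevation); // front-back
    out.Y = monoSample * std::sin(azimuth) * std::cos(elevation); // left-right
    out.Z = monoSample * std::sin(elevation);                     // up-down
    return out;
}
```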

In the Unreal Engine 5 tech demo, ambisonic soundfield rendering is showcased by the enveloping sound of rocks falling down the walls of a crumbling cavern as the player traverses a narrow pathway. As Unreal plays back the ambisonic audio, which can be recorded with a special microphone or authored in sound design software, it also dynamically rotates and transforms the soundfield’s position based on the player’s orientation in the world. For players moving through a level, the effect makes it feel like ambient sounds are truly coming from fixed directions in the world itself, creating a greater sense of immersion than static surround sound. In prior versions of Unreal, developers could approximate the effect in a few ways, but none were as dynamic, CPU-efficient, or easy to execute as in-engine ambisonic soundfield rendering.
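That rotation step is one of the more elegant consequences of the math: reorienting a first-order soundfield to follow the listener amounts to applying a rotation matrix to the directional channels, sample by sample. The sketch below shows the yaw-only case, using the same illustrative conventions as the encoding sketch above; Unreal’s actual transform code is more general.

```cpp
#include <cmath>

// Same four-channel sample type as in the encoding sketch above.
struct BFormatSample
{
    float W, X, Y, Z;
};

// Rotate the soundfield about the vertical axis by `yaw` radians (e.g. the
// inverse of the player's heading), so sounds stay pinned to the world as
// the player turns. Only the horizontal channels mix; pressure (W) and
// height (Z) are unaffected by a yaw rotation.
void RotateSoundfieldYaw(BFormatSample& s, float yaw)
{
    const float c = std::cos(yaw);
    const float k = std::sin(yaw);
    const float x = s.X;
    const float y = s.Y;
    s.X = c * x - k * y;
    s.Y = k * x + c * y;
}
```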

DigiPen graduate and Epic Games audio programmer Max Hayes.

As part of his contribution to the new feature, Hayes wrote the Unreal decoder that translates the soundfield to any speaker setup, as well as the underlying ambisonics math library that makes the effect possible — something he says he was well-prepared for thanks to DigiPen.
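Decoding is where a speaker layout finally enters the picture. One simple approach, sketched below, is a “sampling” decoder that evaluates the soundfield in each speaker’s direction. The Speaker struct, the function name, and the crude normalization are assumptions made for illustration; the production decoder Hayes wrote is certainly more sophisticated.

```cpp
#include <cmath>
#include <vector>

// Same four-channel sample type as in the earlier sketches.
struct BFormatSample
{
    float W, X, Y, Z;
};

// One loudspeaker's direction relative to the listener, in radians.
struct Speaker
{
    float azimuth;
    float elevation;
};

// Basic "sampling" decoder: each speaker's feed is the soundfield evaluated
// in that speaker's direction, crudely normalized by speaker count.
std::vector<float> DecodeToLayout(const BFormatSample& s,
                                  const std::vector<Speaker>& layout)
{
    std::vector<float> feeds;
    feeds.reserve(layout.size());
    for (const Speaker& spk : layout)
    {
        const float feed =
            s.W * 0.70710678f +
            s.X * std::cos(spk.azimuth) * std::cos(spk.elevation) +
            s.Y * std::sin(spk.azimuth) * std::cos(spk.elevation) +
            s.Z * std::sin(spk.elevation);
        feeds.push_back(feed / static_cast<float>(layout.size()));
    }
    return feeds;
}
```

Because the layout is just an argument, the same soundfield stream can feed a stereo pair, a 5.1 ring, or a set of virtual speakers for headphone rendering, which is precisely the decoupling described above.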

“There was a project we did in [math professor Matt] Klassen’s class that was focused on ambisonic representation,” Hayes says. The graphics-only project used ambisonics math to visualize and control “spherical functions and shapes” representing sound pressure in specific directions at specific instances. “Flash forward five months and I had just been converted to full-time at Epic,” Hayes says. “We had a company-wide two-week break and I went back to that DigiPen assignment. I made it actually play audio and act as a 3D RMS meter. An RMS meter gives an idea of how loud a sound is, like those bouncing bars on a boombox. So my second version of the project acted as those bouncing bars but used ambisonics to show loudness as a function of direction again.”
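For readers who want the formula behind those bouncing bars: RMS (root mean square) is simply the square root of the average squared sample value over a short window. A directional meter like the one Hayes describes would first weight the soundfield channels toward each display direction, much like the decoder sketch above, and then run the result through a helper along these lines (the function name is hypothetical):

```cpp
#include <cmath>
#include <vector>

// RMS (root mean square): the square root of the mean of the squared
// samples in a window, a simple stand-in for perceived loudness and the
// quantity behind "bouncing bar" level meters.
float ComputeRMS(const std::vector<float>& window)
{
    if (window.empty())
        return 0.0f;

    float sumOfSquares = 0.0f;
    for (float sample : window)
        sumOfSquares += sample * sample;

    return std::sqrt(sumOfSquares / static_cast<float>(window.size()));
}
```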

While neither of those DigiPen projects featured soundfield decoding, the experience left him uniquely positioned to tackle the challenge when Epic presented it. “Visualizing soundfield representation with these DigiPen projects got me comfortable enough with the math and implementation that I had the opportunity to take on its implementation for Unreal Engine,” Hayes says.

Ambisonic soundfield rendering is just the latest contribution Hayes has made to Unreal since joining Epic full-time; he had already developed a wavetable synthesizer for the engine that debuted last summer. In fact, it was another project he started at DigiPen — a custom audio engine he built for Unreal — that first caught the company’s attention and landed him a student internship during his senior year. “It has been an absolute blast working at Epic!” Hayes says. “Our team size has exploded over the last year. I was the fourth of now eight audio programmers, the newest being Anna Lantz, who is also from DigiPen! I feel very fortunate to be where I feel some very exciting things are in the pipe for game audio, right out of school.”