It can be found in every cinema, in front of every television and in every Dolby Surround system: the sweet spot. This is what scientists call the place in the room where the sound is at its best. The bass is gorgeously deep, the high and mid-range tones are perfectly balanced, and each instrument comes from exactly the direction it should. But then something happens that can ruin the entire listening experience: we move. With this movement, the sound deteriorates until it seems that all the music is coming from the speaker closest to us.
This is precisely the problem that scientists have been researching for decades, and in their laboratories they have found several solutions for a better sound experience. The problem is that these technologies have rarely been practical under real-world conditions; the setup, for a home theater for instance, was simply too complex. Yet in the past several years, big tech companies and researchers, including some in Germany, have made great strides towards solutions and today can project concert experiences into your living room that sound amazing no matter where you sit. This opens up new application areas for home theaters, headphones and even beyond the entertainment sector, for instance when parking a car.
This revolution began several decades ago, when researchers started looking for alternatives to the stereo format. The problem with this format is the sweet spot, which occurs because sound engineers store sound waves digitally on a channel-by-channel basis when mixing music. The different channels are broadcast from different speakers, and where the wave fronts meet is where the sound has the perfect balance for the listener. To recreate that perfect result at home, listeners would need exactly the same setup in their living rooms as the sound engineer had, which is nearly impossible. This problem can be solved by wave field synthesis, which is based on the principle that sound sources can be simulated in the middle of a space by superimposing artificial sound waves from a large number of sources such as loudspeakers. A clever algorithm calculates exactly where these waves meet and what sound they produce. The result: the tones of a violin always sound as if they are coming from the front right side, no matter where you're standing in the space, and the further you move away from it, the quieter it gets. "We have a paradigm shift here: we create object-based sound in such a way that it seems as if we're in the middle of the stage," says Head of Virtual Acoustics Christoph Sladeczek at the Fraunhofer Institute for Digital Media Technology.
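The geometric core of this idea can be sketched in a few lines. In the toy example below (a simplification; real wave field synthesis driving functions also involve filtering and amplitude corrections), each speaker in a line array replays the virtual source's signal with a delay and attenuation matching its distance from the virtual source, so the superimposed wavefronts reconstruct a source that appears to sit behind the array. All positions and the array layout here are made-up illustration values.

```python
import math

C = 343.0  # speed of sound in air, m/s

def driving_signals(speaker_xs, virtual_source):
    """For a line array of speakers along y = 0, compute the per-speaker
    delay and gain that mimic a point source at virtual_source (x, y).
    A speaker farther from the virtual source fires later and quieter,
    so the combined wavefront curves as if emitted from that point."""
    sx, sy = virtual_source
    signals = []
    for x in speaker_xs:
        r = math.hypot(x - sx, 0.0 - sy)     # virtual source -> speaker
        signals.append({
            "delay_s": r / C,                # propagation delay
            "gain": 1.0 / r,                 # 1/r spherical spreading
        })
    return signals

# A virtual violin 2 m behind an 8-speaker array spaced 0.5 m apart
speakers = [i * 0.5 for i in range(8)]
for x, d in zip(speakers, driving_signals(speakers, (1.75, -2.0))):
    print(f"x={x:4.2f} m  delay={d['delay_s'] * 1000:5.2f} ms  gain={d['gain']:.3f}")
```

Because every speaker's delay is tied to the same virtual point, the reconstructed wavefront keeps its curvature everywhere in the room, which is why the violin stays "front right" regardless of where the listener stands.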
This paradigm shift has triggered a revolution, albeit in fits and starts at the moment. A real wave field requires hundreds of loudspeakers, which isn't feasible outside of laboratories. In recent years, however, scientists have managed to use algorithms and existing speakers to create a virtual wave field, an experience much like that of an actual one. In large venues such as the Zurich Opera House, audiences can now listen to concerts as if they were sitting directly onstage.
People are also sensing these changes on a much smaller scale. Since increasing numbers of music pieces, films and television series are now being recorded in an object-based way, it is now worthwhile for companies to jump on this bandwagon and promise listeners a 3D aural experience. Dozens of startups and large tech companies are working on this, and a number of products have already hit the market. For instance, Apple is using these recordings for its Spatial Sound. When this setting is turned on in headphones, the source of a sound always seems to remain in the same place. As the listener slowly turns, it first sounds as if the violin is coming from the front right, then from next to them, and then from behind them. Headphones don't use wave field synthesis to achieve this, but binaural synthesis. This exploits the fact that we locate sounds through tiny timing and loudness differences between our two ears, and through the way our ears filter certain frequencies depending on direction. When a set of headphones imitates these cues, it sounds to us as if an object is in front of, next to, or behind us. As Sladeczek says, "What we're looking at is a completely new way to experience music and audio."
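The two ingredients described above, directional ear cues plus head tracking, can be sketched as follows. This is only an illustrative model, not Apple's implementation: the interaural time difference uses Woodworth's classic spherical-head approximation, and the head radius is an assumed average value.

```python
import math

HEAD_RADIUS = 0.0875  # m, assumed average head radius
C = 343.0             # speed of sound in air, m/s

def itd_seconds(azimuth_deg):
    """Woodworth's spherical-head approximation of the interaural time
    difference for a source at the given azimuth (0 = straight ahead,
    +90 = hard right). Positive ITD means the sound reaches the right
    ear first; the brain reads this delay as direction."""
    th = math.radians(azimuth_deg)
    return (HEAD_RADIUS / C) * (math.sin(th) + th)

def rendered_azimuth(source_world_deg, head_yaw_deg):
    """Head tracking: keep the source fixed in the room by rendering it
    at the world azimuth minus the listener's head rotation, wrapped
    to (-180, 180]."""
    return ((source_world_deg - head_yaw_deg + 180.0) % 360.0) - 180.0

# A violin fixed at front-right (45 deg) while the listener turns
for yaw in (0.0, 45.0, 90.0, 180.0):
    az = rendered_azimuth(45.0, yaw)
    print(f"head yaw {yaw:5.1f} deg -> render at {az:6.1f} deg, "
          f"ITD {itd_seconds(az) * 1e6:7.1f} us")
```

As the listener's yaw grows, the rendered azimuth sweeps from front-right, past the side, to behind, exactly the progression the article describes.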
Sladeczek also believes there are many potential fields of application well beyond the entertainment industry. For instance, cars are being equipped with increasing numbers of speakers, so that the acoustic channel can play a greater role in a vehicle already flooded with visual input. In the future, the 3D experience could assist with parking. Instead of just beeping when a car is backing up towards an obstacle, a much more sophisticated system could let drivers hear how the position of the obstacle relative to the car changes as they turn the steering wheel or step on the brake. "It would change a lot of things," Sladeczek says.