Last month, their skills were put to an even bigger test when digital creative fest Onedotzero approached the duo to bring their audio-responsive visuals not just to one screen but several, and not just to plain old music but to multidimensional sound, as part of a 3D Soundclash sponsored by Red Bull Music Academy at London's Royal Albert Hall, which pitted musicians from Warp Records and Ninja Tune against each other in a musical face-off.
Multidimensional sound, you ask? Well, in plainspeak, that would be tunes that don't just flood the room as they would in your average concert, but whose various channels could be directed at precise locations, via a multi-component and highly targeted sound setup—in this case, Illustrious Company's 3D Soundsystem.
Minivegas' Luc Schurgers and Daniel Lewis provided us with the above video explaining the setup and elaborated on how the project required them to ramp up their reactive visual tech in the interview below.
Tell us how you got involved in this project, and what were you tasked to do?
The project came to us via Onedotzero Industries. They had commissioned the S4C audio-reactive idents we did, and we'd also done some really fun custom visuals for a festival they put on in Taipei a few years ago. We'd just signed with Nexus Productions in London, who did a fantastic job producing the visual content and also housed our team's "workshop" (a formidable wall of six monitors hooked up to a very noisy Mac).
The project brief, initially explained on a napkin over a few beers, contained some exciting names like Red Bull Music Academy, Warp and Ninja Tune, the mysterious term "3D Sound System" and a rather frightening deadline. We were tasked with developing an immersive visual experience that would complement the nature of the event: a "3D Soundclash" between artists from the two labels.
As well as the awesome roster of artists playing at the event, the main thing that piqued our interest was this notion of a "3D" soundclash. Illustrious Company has been developing this technology for the last 10 years, though I think the original ambisonics research was first done in the 1970s.
We discussed using our old software, but Onedotzero were very keen on using multiple screens— something we hadn't tried before. They also wanted to heavily emphasize the 3D sound element, to do something unique with that. We took this away and decided that we would aim to put our visual content in the same physical space as the sounds, to form a strong bond between the two and the audience.
The idea was to use a ring of five projection screens around the walls of the club, projecting out onto a visual landscape outside the room and reacting to the sound events being projected into the room. We aimed to match the rendering of each screen so that if you stood at the center of the room (the audio "sweet spot") it would be just like looking through windows onto the world beyond.

How did your previous projects help/inform what you did?
Our previous work on the "rDog" and "Visualizer" projects gave us a sense of how reactive we'd like things to be. Since those, which were mainly line-based, graphics cards have moved on and we were able to use a lot more effects: real-time lighting and better physics. The difference is that we now had five times as many frames to render per second!

Tell us about Illustrious Company's 3D Soundsystem—what role did that play in the event and how did it relate to your role?
Illustrious Company developed the "3D Soundclash" concept initially with Red Bull Music Academy. Their setup uses two rings of speakers—eight high and eight low—and some unique signal processing software to render incoming audio channels to different positions. For instance, they can put the bass in the center of the room on the floor, then spin multiple vocal layers around your head (though apparently if they move the bass too fast, people tend to get sick!).
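The article doesn't detail Illustrious Company's actual rendering algorithm (their system builds on ambisonics research), but the basic idea of placing a sound at a position on a ring of speakers can be sketched with simple equal-power amplitude panning between the two speakers nearest the source. This is a minimal illustrative sketch, not their implementation; the function name and eight-speaker default are assumptions for the example.

```python
import math

def pan_to_ring(source_angle_deg, num_speakers=8):
    """Distribute one mono source across a horizontal ring of speakers
    using equal-power panning between the two nearest speakers.
    Returns a per-speaker gain list. Speaker 0 sits at 0 degrees,
    with the rest spaced evenly around the circle."""
    spacing = 360.0 / num_speakers
    angle = source_angle_deg % 360.0
    lower = int(angle // spacing)               # speaker just before the source
    upper = (lower + 1) % num_speakers          # its clockwise neighbour
    frac = (angle - lower * spacing) / spacing  # 0..1 position between them

    gains = [0.0] * num_speakers
    # An equal-power crossfade keeps perceived loudness roughly constant
    # as the source sweeps between the two speakers.
    gains[lower] = math.cos(frac * math.pi / 2)
    gains[upper] = math.sin(frac * math.pi / 2)
    return gains

# A source at 22.5 degrees sits exactly halfway between speakers 0 and 1,
# so both receive the same gain (cos 45° = sin 45° ≈ 0.707).
print([round(g, 3) for g in pan_to_ring(22.5)])
```

Sweeping `source_angle_deg` over time is what produces the "spinning around your head" effect described above; a real system like Illustrious's also handles the second (height) ring and far more sophisticated spatial rendering.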
Their role was to create the 3D audio setup for the event and collaborate with the artists to make the best use of it. The Warp and Ninja Tune artists are pretty clued up when it comes to technology, and certainly on the night we saw some artists sending lots of crazy effects spinning around the room.

Did you guys collaborate with IC at all?
Yes, they are really great guys and we spent a fair bit of time going back and forth to their demo studio in Brixton. After some explanation of how their software works, and some development, we managed to connect our computers, and from then on it was a case of making sure that we saw the sound appear where they were projecting it. We built some rough pre-viz tools to check this—cue lots of colored pulsing dots floating around a mock-up of the space—to make sure that everything was calibrated to the physical layout of the room, so the visual content would match up.

You worked with designers Quayola and Thomas Traum, right? What did that collaboration involve?
We know both from way back; they're very talented guys. Onedotzero had been talking to Quayola recently and were very keen to have him on board. We've worked with Thomas Traum several times before and he came on board after a rather excited pre-project dinner we all had. Thomas also introduced us to Field, who we brought on to help with programming.
The process was iterative: Quayola and Thomas did designs, we would implement them as best we could, and then get feedback. Despite the hectic schedule it was a nice way to work. The Field guys built a tool to help the designers edit certain aspects of the visuals—textures, shaders, etc.—and that became very useful towards the end.

What were the biggest technical hurdles involved in this project?
Mainly timing, and integrating with the other aspects of the project. In this respect, IC were great; we met with them early on and they were very helpful with all the specs of their software and protocols. I imagine things could have been quite different if we'd left this all until later or had been working with different people.
There was a certain amount of R&D involved—working out how to render the projection screens with the correct camera parameters, interfacing with the audio system, implementing the designers' looks of course, and working out which kit would best render five outputs.
We also required detailed measurements of the space, so keeping on top of that was important.

Overall, what were you trying to accomplish with this installation? What did you learn from it, and what would you have done differently if you were to do it again?
We were aiming to achieve a fusion between the spatial projection of the audio and the visual content being generated from it. We wanted people to feel a closer connection between what they heard and what they saw than they had experienced before.
We'd love to try the same setup again, perhaps in a different context than a full-on club night, with a lot more time to set up. For instance, an installation with sound-designed 3D audio and some live elements, as an alternative to the completely live and freeform nature of the night, with multiple DJs having completely different styles (though of course, this is one thing that made the event so exciting). We'd also probably tweak the kit slightly—in fact, we've met plenty of good people along the way who have given us advice we'd have been happy to have at the beginning. We've certainly learnt a lot!
Also, this Saturday we're collaborating with Quayola to perform audio-reactive visuals over two nights they are putting on at the Roundhouse in London. There'll be no 3D sound, unfortunately, but it will be a nice coda to all the hard work and collaboration on this project. Looking forward to sleeping in late on Sunday!