Minivegas Walks Through the Visualizer


Directing collective Minivegas recently developed a virtual gallery that creates art in real time in response to music and gestures. MP3s uploaded into the gallery drive the growth and formation of digital sculptures, and gestures captured through the webcam change the viewer's perspective of the space.

Minivegas' Luc Schurgers explains how the Visualizer works here.

And here he takes us further behind the work:

Tell us about the history of the visualizer. Why did you create it? How long did it take to develop?
We've been mucking about with sound visualizations for a while, and this is kind of a show-off piece, to be honest. We wanted to show what could be done in realtime on a two-year-old laptop, and it was good to make something finished and polished. When we were doing S4C we wanted to shoot some of the spots with a moving camera, as we'd figured out how to do it after completing the first three spots as locked-off shots. The client felt that it broke the consistency of the campaign, which was totally true of course. We still really wanted to apply the idea to a different project, but it was pretty hard to explain it to other clients without showing it, so we thought we'd just do it ourselves. The piece took about two months to make, but the project was spread over quite a while as we couldn't work on it full time, unfortunately.

You developed this using the technology that helped you to create the S4C campaign. Can you tell us about what was involved with R&D? What technologies did you develop for S4C and what new ones did you create for the visualizer?
Just like S4C, the program uses a fast Fourier transform (FFT) to analyze the frequency spectrum of the audio--either audio files or line-in. The audio analysis is then used to influence the behavior of the graphics. In S4C we used it to drive realtime composites and playback of live-action plates, but for this project it's actually driving 3D geometry. For this project we created realtime metaballs, random polygon clipping, particle systems, physics simulations and randomly growing geometry. The program is built for the NVIDIA GeForce 8 and 9 series graphics chipsets, and uses advanced OpenGL 2.1 features to achieve graphical effects such as realtime depth-texture-based shadow mapping, realtime cubic environment mapping and realtime Phong-like shading using GPU vertex and pixel shader programs.
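Minivegas hasn't released the source, but the pipeline described here--FFT the audio, then map band energies onto geometry parameters--can be sketched in a few lines. The C++ sketch below uses the FFTW library (fftw.org); the band split, function names and the parameter mappings in the comments are illustrative assumptions, not their actual code.

```cpp
// Minimal sketch: collapse one window of mono samples into three
// band energies (bass / mid / treble) that can drive geometry.
// Band edges and names are illustrative assumptions.
#include <fftw3.h>
#include <complex>
#include <vector>

const int N = 1024;  // analysis window, in samples

std::vector<double> analyzeBands(std::vector<double> window) {
    // Real-to-complex FFT of the window (window must hold N samples).
    std::vector<std::complex<double>> spectrum(N / 2 + 1);
    fftw_plan plan = fftw_plan_dft_r2c_1d(
        N, window.data(),
        reinterpret_cast<fftw_complex*>(spectrum.data()),
        FFTW_ESTIMATE);
    fftw_execute(plan);
    fftw_destroy_plan(plan);

    int edges[4] = {1, 8, 64, N / 2};  // FFT bin ranges per band
    std::vector<double> bands(3, 0.0);
    for (int b = 0; b < 3; ++b) {
        for (int k = edges[b]; k < edges[b + 1]; ++k)
            bands[b] += std::abs(spectrum[k]);   // magnitude per bin
        bands[b] /= edges[b + 1] - edges[b];     // average band energy
    }
    // e.g. bass -> sculpture growth rate, mids -> metaball blend
    // radius, treble -> particle emission rate
    return bands;
}
```

Each frame, the returned band energies would be fed into whichever graphics mode is active, so the geometry responds to the music as it plays.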

Overall, what were some of your biggest challenges on the Visualizer?
The tricky shit is making things move fast. When you're doing realtime stuff you don't have time to render or pre-calculate, as everything needs to be computed instantly. Although the program could be optimized much further--it's really a prototype--many of the graphics modes already achieve a frame rate of 60 frames per second even when running on a laptop! So that's pretty fucking fast.

Achieving this speed was helped by off-loading a lot of the work from the CPU to the GPU, making use of dual-core CPUs by running the computer vision code and the graphics code on different CPU cores, and using texture compression to achieve smooth playback of 1k backplates from a laptop hard drive.
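As a rough illustration of that dual-core split, here is a hypothetical C++ sketch (using C++11 threads for brevity, which postdate the project) in which a vision thread and a render thread share only a small, mutex-guarded result struct, so the renderer never blocks on the camera. All names here are assumptions, not Minivegas' code.

```cpp
// Sketch of the dual-core split: computer vision on one thread,
// rendering on another, sharing only the latest gesture result.
#include <atomic>
#include <chrono>
#include <mutex>
#include <thread>

struct GestureResult {        // whatever the vision code extracts
    float headX = 0, headY = 0;
};

std::mutex resultLock;
GestureResult latest;          // most recent vision output
std::atomic<bool> running{true};

void visionThread() {
    while (running) {
        GestureResult r{};
        // ... fill r from the current webcam frame (OpenCV etc.) ...
        std::lock_guard<std::mutex> g(resultLock);
        latest = r;            // publish to the render thread
    }
}

void renderThread() {
    while (running) {
        GestureResult r;
        {
            std::lock_guard<std::mutex> g(resultLock);
            r = latest;        // copy out, holding the lock briefly
        }
        // ... update camera/geometry from r and draw at 60 fps ...
    }
}

int main() {
    std::thread vision(visionThread), render(renderThread);
    std::this_thread::sleep_for(std::chrono::seconds(5));  // run briefly
    running = false;
    vision.join();
    render.join();
}
```

The key design point is that the two loops run at their own rates on separate cores; the render loop only ever does a cheap copy under the lock, so a slow camera frame can't stall the graphics.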

What sorts of applications do you foresee for this? What's next in the line for you guys to develop?
The options are endless, to be honest. We can take this to broadcast, as we've developed all the required technology before, or in-store, mobile or web, of course. As it's custom code it can be applied to pretty much any medium. What we really want to do is take this a step further into an augmented-reality environment. It would be pretty awesome to have realtime camera tracking around these sculptures. Or if we're talking broadcast, it would be cool to do something on a moco (motion control) path! This shouldn't be too hard to do, as all the augmented-reality libraries are already available.

I guess it's just a matter of finding the right client. Another thing we'd love to do is take the engine designed for this, or any other game engine, and put it on top of a live-action shoot. That would be supercool for a game commercial, as you'd be fusing a realtime game engine with live-action plates.

What are you guys working on now?
We're currently working on a bunch of spots for the nice chaps at Goodness, but those are straight live-action spots with a shitload of CG, which is great fun to do too. Interactive-wise, we're working on some cool art stuff, and we're about to finish our first iPhone app and website.

Are you going to make it available for the general public to use?
Yes, we'd like to, but since it's quite a hassle to support public use across different hardware specifications and operating systems, it's better to wait till we find the right commercial client to team up with, as that kind of support takes quite a while to develop.

Anything else interesting about the project you want to add?
We're showing our interactive work in the gallery in London soon, so if you're around, drop by.

See how the tech worked on S4C, an interactive ident campaign for Welsh broadcaster S4C. The actual footage of the idents would change in response to the voices of the broadcast announcers.