#066 – The whole ball of wax (static)
We want it all, right here, right now, the whole matter.
All the elements, the entire affair, the whole shebang.
Bring it on, all the way, let’s run the whole shooting match!
Get us your largest, your richest, your fattest, the whole enchilada.
We’re done with small, no longer daunted by the whole ball of wax!
Quad Damage
–https://youtu.be/JnqF2HMcYI0
#066 – The whole ball of wax (Panimating)
So let me tell you about the dream I had, while shaking my ear wax with some fresh tune straight from Burning Man 2016.
For many months spent massaging computer code and visual assets on screen, with an urge to produce the novel, the unexpected, sometimes even the bizarre, I have made my way across many software platforms, trying to get my boots out of the muck of static rendering. In 2D, 3D and even fractal dimensions, I have been feverishly coding, modelling, texturing, filtering, lighting and finally rendering on a flat screen pretty yet desperately static images.
I have been craving more, secretly hoping to unlock the power of realtime content generation and animation. And to be fair, I have met with mixed success using amazing tools such as Processing, Unity3D, Vuo, Particle Designer, and a few others. However, I inevitably kept hitting the glass ceiling, ending up rendering JPEG images and pre-rendered MP4 animations. Only too rarely have I been able to claim that I created one of my “Panimatings” out of pure code.
And I had this dream, of a piece of software that would allow me to generate visual and audio content in a nodal and procedural manner, and to orchestrate and composite it at will. A tool that would let me code shaders, layer them with different blend modes, and parameterise them to react to various forms of input, such as an audio signal, or simply a mouse gesture or a key stroke. If only I could build it myself, this platform in which I could apply any sort of distortion or stylisation filter to any content, and do so in real time, with very lean coding.
Well, as I woke up this morning, and started to bounce from one link to another in my browser, I think I stumbled upon it, the holy grail of real time visual animation. It actually came up as a twofold surprise.
The first is simply named Magic, and it is a procedural visualisation tool primarily meant to be used for musical performances. One of the great things about Magic is that you can easily throw OpenGL shaders on screen, and it comes with a collection of about 400, all sourced from the great community site GLSLSandbox.
The cherry on the cake: all these shaders can be parameterised to react to audio inputs in a large variety of ways. The result is staggering real-time OpenGL animation, in HD and at 60fps if your Mac can take it.
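To give a flavour of what these shaders look like, here is a minimal GLSLSandbox-style fragment shader sketch. The `time` and `resolution` uniforms are the standard GLSLSandbox conventions; the `audioLevel` uniform is a hypothetical stand-in for whatever audio-reactive parameter the host application wires in, since each tool exposes audio input in its own way.

```glsl
#ifdef GL_ES
precision mediump float;
#endif

// Standard GLSLSandbox uniforms.
uniform float time;        // seconds since the shader started
uniform vec2 resolution;   // viewport size in pixels

// Hypothetical audio input, assumed normalised to 0.0–1.0 loudness.
uniform float audioLevel;

void main(void) {
    // Normalise pixel coordinates and re-centre them at the origin.
    vec2 uv = gl_FragCoord.xy / resolution.xy;
    vec2 p = uv * 2.0 - 1.0;

    // Concentric rings that drift with time and pulse with the audio level.
    float r = length(p);
    float rings = sin(10.0 * r - time * 2.0 + audioLevel * 6.0);

    // Tint the rings with a slowly cycling colour gradient.
    vec3 col = vec3(0.5 + 0.5 * rings) * vec3(uv, 0.5 + 0.5 * sin(time));
    gl_FragColor = vec4(col, 1.0);
}
```

Pasted into GLSLSandbox as-is (minus the `audioLevel` term), a fragment like this animates on its own; in an audio-reactive host, the extra uniform is what makes the rings throb with the music.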
The second takes us one step further, into the celestial plane of real time visual compositing, and that’s what CoGe is meant for. As it defines itself, CoGe is “a powerful, and extendable professional VJ software designed for realtime HD video mixing and compositing with a modular user interface – exclusively for Mac OSX.”
On Mac, both applications implement the Syphon standard. Syphon is an open source Mac OS X technology that allows applications to share frames – full frame rate video or stills – with one another in realtime. And I discovered along the way, to my pleasant surprise, that other Syphon-compatible software includes DiscoBrick (another audio visualisation tool) and Vuo, as well as – through plugins – Quartz Composer, Cinema4D, Unity3D and Processing.
Using Magic and CoGe together opens up mind-blowing perspectives! If you get your audio sources right, thanks to Soundflower or Boom for instance on Mac (I am using the latter), then you can pick a great long track on SoundCloud or Mixcloud and get started on a good visualisation session.
Sound track by YokoO @ Burning Man 2015 – White Ocean
The GLSL shader used