A TouchDesigner experiment exploring how sound can drive motion and geometry with visual storytelling.
This project was created using audio analysis in TouchDesigner. The goal was to design a Spotify Canvas-style visual that reacts dynamically to music, and to make it controllable with keyboard inputs so the user can randomize the starting point and regenerate the animation.
The goal was to translate sound into visual motion by analyzing audio data such as amplitude and beat detection, and to create several types of movement using shapes, noise settings, and particles. This project focuses on how shapes and material transformations can produce a unique animation, especially when they are driven in real time by audio analysis controls.
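The core idea behind this kind of audio-driven motion can be sketched in plain Python. This is not the actual network used in the project, just a minimal illustration of the two operations most of the mappings rely on: smoothing a raw amplitude signal (what a Lag or Filter CHOP does) and remapping it into a parameter range (what a Math CHOP's range settings do). The ranges and names here are hypothetical.

```python
def smooth(prev, target, lag=0.15):
    """Exponential smoothing, similar in spirit to a Lag/Filter CHOP:
    the output chases the raw signal instead of jumping with it."""
    return prev + (target - prev) * lag

def remap(value, in_low, in_high, out_low, out_high):
    """Linear remap with clamping, like a Math CHOP's from/to range
    parameters: scales a 0..1 audio level into a useful parameter range."""
    t = (value - in_low) / (in_high - in_low)
    t = max(0.0, min(1.0, t))  # clamp so spikes can't overshoot the range
    return out_low + t * (out_high - out_low)

# Hypothetical per-frame loop: raw amplitude samples (0..1) driving a
# noise amplitude parameter between 0.2 and 2.0.
amplitude = 0.0
for raw in [0.1, 0.8, 0.3, 0.9, 0.2]:
    amplitude = smooth(amplitude, raw)
    noise_amp = remap(amplitude, 0.0, 1.0, 0.2, 2.0)
```

The smoothing step is what keeps the motion from feeling jittery: the visuals react to the music's envelope rather than to every individual sample.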
While researching ideas for the animation, I came across dynamic images and TouchDesigner videos that completely caught my attention. They featured supernova-like explosions and dark portal orbit vibes, which I really liked and wanted to recreate. Since my style often works around dark backgrounds with bright light motion effects and shape movement, I thought it would perfectly fit the theme I was going for, and the space-type scenery would make the direction clear to the audience. The visual direction therefore leaned toward abstract transformations driven by rhythm and frequency. Conceptually, the project draws from motion design systems where sound directly impacts geometry, color, and transformation to create a cosmos atmosphere.
The project was built using a structured node system in TouchDesigner, combining multiple operators from the TOP, CHOP, SOP, MAT, and POP families to process audio and render visuals.
The final result is an 8-second looping animation designed for the Spotify Canvas format, giving a brief sense of how the full animation would look and how much dynamism it packs into a short loop.
Several audio parameters were mapped in the operator settings to visual transformations:
These mappings allowed the animation to feel synchronized and responsive to the music, while also letting the user control certain aspects of the aesthetic, such as the noise and the number of particles, to modify the movement and its speed.
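As a small sketch of the keyboard control idea, here is how a keypress could randomize the starting point by picking a new noise seed and particle count. In TouchDesigner this logic would typically live in a CHOP Execute DAT attached to a Keyboard In CHOP; the parameter names and ranges below are hypothetical, not the project's actual values.

```python
import random

# Hypothetical ranges for the user-controllable parameters
NOISE_SEED_RANGE = (0, 1000)
PARTICLE_COUNT_RANGE = (500, 5000)

def randomize_start(rng=random):
    """Pick a fresh random starting state for the animation, as a
    keypress handler might. Returns the new parameter values so the
    caller can write them onto the relevant operators."""
    return {
        "noise_seed": rng.randint(*NOISE_SEED_RANGE),
        "particle_count": rng.randint(*PARTICLE_COUNT_RANGE),
    }

# Each call simulates one keypress producing a new starting point.
state = randomize_start()
```

Passing the random source in as an argument makes the handler easy to test with a seeded generator, while keeping the keypress path a one-liner.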
To better showcase motion and detail, selected frames from the animation are displayed below. The visual morphs from a potion pot, to a flower, to an hourglass, and finally to a cosmic galaxy portal, so a new image or design seems to appear every few moments. All of that happens within seconds, whether triggered by keyboard inputs or by the natural progression of the particle timer animation.
One challenge was achieving smooth and meaningful audio-driven motion without making the visuals feel too chaotic. Another difficulty was balancing performance against visual complexity: if I added too many particles or pushed certain settings too far, the whole animation would start lagging to the point where I could no longer time the aesthetic changes consistently.
This project helped me understand how real-time data can influence motion, and how much an animation can change just by experimenting with different control options. I also learned how to use some of the brand new POP operators available in TouchDesigner's experimental version, which was fun to discover and implement in my project. In the future, I would love to explore more complex systems involving feedback loops.