exploring generative sounds through tangible interfaces

This project explores the creation of generative sounds with tangible objects as interfaces, developed within the context of audio-visual art installations. It started with MAGENTA (for AI-generated audio) and controllers that communicated with TouchDesigner. To make the system responsive in real time, the focus shifted to TouchDesigner and Max/MSP. For an installation that works with tangible objects classified by colour, colour tracking is the technique employed to identify and track objects via a webcam. Colour tracking together with OSC messaging then became the means of real-time communication between TouchDesigner and Max/MSP. This creates an engaging experience for the participants, and the learning can then be applied to future installations to create a cohesive audio-visual art experience.

an integrated approach using TouchDesigner and Max/MSP

The participant interacts with the installation by placing objects on the table; based on the colour of the objects placed, the corresponding sounds are played. By moving the objects on the table, the participant also controls the metro parameter of these generative sounds (whose output can be quantized using tempo-relative syntax), creating a sense of play. The participant's interaction with these objects results in a generative soundscape and a corresponding visual. Since each colour corresponds to a particular sound, the participants also compose the soundscape: by placing objects individually, the participant learns the sound each object creates; by moving them around, they control the frequency at which each sound repeats in each cycle; and by adding and removing objects, they can shape the soundscape they want.
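The position-to-rhythm mapping described above can be sketched in a few lines. This is a hypothetical illustration, not the installation's actual Max patch: it assumes a normalized y position (0.0 to 1.0) arriving from the tracker and quantizes it to one of Max's tempo-relative note values, which could then drive a metro object's interval.

```python
# Hypothetical mapping from a tracked object's normalized y position to a
# tempo-relative interval for Max's metro object. The note-value list and
# the equal-width banding are illustrative assumptions.

NOTE_VALUES = ["1n", "2n", "4n", "8n", "16n"]  # slowest to fastest repetition

def y_to_metro_interval(y: float) -> str:
    """Map y in [0, 1] to one of the tempo-relative note values."""
    y = min(max(y, 0.0), 1.0)  # clamp out-of-range tracker noise
    index = min(int(y * len(NOTE_VALUES)), len(NOTE_VALUES) - 1)
    return NOTE_VALUES[index]
```

Quantizing to note values (rather than sending raw milliseconds) keeps every object's repetitions locked to the shared transport, so overlapping sounds stay rhythmically coherent as objects move.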
 
The colour tracking system developed in TouchDesigner isolates each object. The approximate centre coordinates (x, y) are calculated for each object in TouchDesigner, and these values are sent to Max via OSC messaging, mapped so that each colour corresponds to one or more specific sounds. The new colour tracking system is more accurate and effective than blob tracking and basic threshold mapping. As an experiment, three different sounds were assigned to the colour green, and the x and y values were mapped differently for each of the three sounds to create interesting overlays. The corresponding visuals also help create an engaging experience: the camera feed is treated through a particle system, adding motion and animation to the composition created on the table. The y value of green is also used as a real-time trigger to control the simple harmonic motion of the particles in the generated visual output. This helped in creating a dynamic output and a more participatory experience for the composer.
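The centre-finding step can be illustrated with a minimal sketch. The real installation does this with TouchDesigner TOPs on the live webcam feed; here, as an assumption for clarity, a frame is a plain grid of RGB tuples, the "green" threshold test is illustrative, and the centre is the mean position of matching pixels, normalized to 0-1 so it can be sent over OSC.

```python
# Minimal sketch of colour isolation: find the approximate centre of all
# "green" pixels in an RGB frame. The threshold rule is an illustrative
# assumption, not the installation's actual TouchDesigner network.

def is_green(r, g, b, margin=50):
    """Crude green test: green channel dominates both red and blue."""
    return g > r + margin and g > b + margin

def green_center(frame):
    """frame: list of rows, each row a list of (r, g, b) tuples.
    Returns the normalized (x, y) centre of green pixels, or None if
    no green object is visible."""
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, (r, g, b) in enumerate(row):
            if is_green(r, g, b):
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    h, w = len(frame), len(frame[0])
    return (sum(xs) / len(xs) / (w - 1), sum(ys) / len(ys) / (h - 1))
```

In the installation, a pair like this would be sent each frame on a per-colour OSC address (for example `/green/xy`, a hypothetical address), which Max then maps onto the sound parameters for that colour.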

year

2023

tools

Max/MSP + TouchDesigner + OSC Messaging + Webcam

.say hello

I'm likely out for a walk or watching dogs at the park. Reach out if you’d like to chat (or need a caffeine fix in Toronto!)
