01 Discover
Context
This project is a creative homage to Franz Kafka's The Metamorphosis, blending generative music and reactive visuals into an immersive, multi-sensory performance. Built with Processing, Pure Data, and TouchDesigner, it combines sound, visuals, and thematic storytelling to explore Kafka's themes of transformation, alienation, and identity, drawing the audience into a compelling artistic interpretation of his seminal work.
Challenges
The performance integrates several synthesis techniques, including Additive, Subtractive, Frequency Modulation (FM), and Granular Synthesis, to craft dynamic and evolving soundscapes. Doing so raised several challenges:
- Seamless Transitions: Ensuring smooth shifts between musical sections to maintain the immersive atmosphere without disrupting the flow.
- Unified Control: Managing the entire performance with a single controller, providing intuitive access to all features.
- Real-Time Audio-Visual Synchronization: Tightly connecting music and visuals in real time so that every sound has a visual counterpart.
- Strong Audio-Visual Synergy: Beyond technical synchronization, shaping the relationship between audio and imagery so the two reinforce each other aesthetically.
References
- Ryoji Ikeda: A renowned artist known for his minimalist and data-driven audiovisual compositions.
- "unfold" by Ryoichi Kurokawa: A performance blending visual art and sound to explore themes of space and time.
- "Drone": A piece available on YouTube, showcasing immersive audio-visual elements.
- Profiteroles by Rino Petrozziello: Another artwork available on YouTube, featuring unique artistic expression.
02 Implementation
Processing
One of the biggest challenges in this project was finding a way to control as many parameters as possible with minimal hardware—using fewer sliders, buttons, and controllers. To address this, I developed a 2D slider to act as the master controller for the entire performance.
I designed the 2D slider in Processing, focusing on simplicity and functionality. Using basic shapes and tracking mouse positions, I captured the values needed to interact with Pure Data effectively. While I won’t dive into the technical details, the core strategies involved:
- Defining boundaries for each chapter of the performance to ensure smooth transitions.
- Using mathematical operations to precisely map the values to fit the requirements of the soundscapes and visuals.
This approach allowed for intuitive, real-time control of multiple variables, enabling seamless coordination across the audio and visual elements of the performance.
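The boundary-and-mapping logic described above can be sketched as follows. This is an illustrative reconstruction in Python, not the actual Processing sketch: the chapter regions, parameter names, and ranges are assumptions.

```python
# Hypothetical sketch of the 2D-slider mapping logic. Chapter boundaries,
# parameter names, and value ranges are illustrative assumptions.

def lerp_map(value, in_min, in_max, out_min, out_max):
    """Linearly map value from [in_min, in_max] to [out_min, out_max]."""
    t = (value - in_min) / (in_max - in_min)
    return out_min + t * (out_max - out_min)

# Four horizontal regions of the slider, one per chapter of the performance.
CHAPTER_BOUNDS = [(0.0, 0.25), (0.25, 0.5), (0.5, 0.75), (0.75, 1.0)]

def chapter_for(x):
    """Return the chapter index (0-3) for a normalized x position."""
    for i, (lo, hi) in enumerate(CHAPTER_BOUNDS):
        if lo <= x < hi:
            return i
    return len(CHAPTER_BOUNDS) - 1  # x == 1.0 falls into the last chapter

def slider_to_params(x, y):
    """Map a normalized (x, y) mouse position to a chapter and two parameters."""
    chapter = chapter_for(x)
    lo, hi = CHAPTER_BOUNDS[chapter]
    # Re-normalize x within the current chapter, then map both axes to ranges.
    local_x = lerp_map(x, lo, hi, 0.0, 1.0)
    freq = lerp_map(local_x, 0.0, 1.0, 110.0, 880.0)  # e.g. oscillator pitch
    amp = lerp_map(y, 0.0, 1.0, 0.0, 1.0)             # e.g. amplitude
    return chapter, freq, amp
```

Defining the boundaries once and re-normalizing inside each region is what allows a single 2D control to drive different parameter sets per chapter without discontinuities at the transition points.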
OSC
Some of the libraries I looked at were oscP5, controlP5, mrPeach, and the OSC library available through PD's browser. On the Processing side, oscP5 handled sending values over OSC, while controlP5 provided the GUI elements.
The first OSC setup I tried, using mrPeach's routeOSC objects, worked but not consistently: I had to repeatedly close and reopen PD before it would read the incoming messages properly. Eventually, I decided to explore other alternatives.
Later, I discovered a solution using the netreceive object, which resolved all issues with receiving OSC messages:
- Ensured PD would reliably receive messages from Processing.
- Streamlined the workflow by avoiding repetitive setup processes.
By implementing this method, I could successfully send and read messages between Processing and PD.
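For context, PD's netreceive object (in UDP mode, e.g. `[netreceive -u 3000]`) accepts plain-text FUDI messages: space-separated atoms terminated by a semicolon. The sketch below illustrates that framing in Python; the port number and message names are assumptions, and the real project sends from Processing.

```python
# Illustrative sketch of sending FUDI-framed messages to PD's [netreceive].
# The host, port, and selector names are assumptions, not the project's values.
import socket

PD_HOST, PD_PORT = "127.0.0.1", 3000  # must match [netreceive -u 3000] in PD

def fudi_message(selector, *args):
    """Format a FUDI message, e.g. fudi_message('freq', 440) -> b'freq 440;\n'."""
    atoms = [str(selector)] + [str(a) for a in args]
    return (" ".join(atoms) + ";\n").encode("ascii")

def send_to_pd(sock, selector, *args):
    """Send one FUDI message to the PD patch over UDP."""
    sock.sendto(fudi_message(selector, *args), (PD_HOST, PD_PORT))

# Usage (assumes a PD patch with [netreceive -u 3000] is listening):
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# send_to_pd(sock, "slider", 0.42, 0.87)
```

Because the framing is plain text with a semicolon terminator, PD can route each message with [route] objects as soon as it arrives, with no per-session setup.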
Pure Data
The performance is divided into four chapters, each representing a distinct phase of the metamorphosis:
- Awakening – The break from silence and sleep, signaling the start of transformation.
- Negation and Lostness – A phase of resistance and disorientation, marking the onset of change.
- Metamorphosis – The moment of no return, where transformation fully takes shape.
- Acceptance and New Reality – A return to calm, embracing the changes that define the new reality.
Each chapter is represented by its own soundscape, carefully designed to reflect its unique feeling and rhythm while maintaining continuity as part of a cohesive performance.
Creating Continuity and Differentiation
The performance revolves around four main notes, introduced sequentially in the first chapter. These notes act as the core of the soundscapes, manipulated through changes in rhythm, reverb, oscillators, and scales to provide both continuity and differentiation across chapters.
For added tension and resolution, dissonant notes were derived by shifting the main notes up or down within various scales. The balance between tension and release was reached through experimentation and iteration.
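The pitch-shifting idea can be sketched in equal temperament, where shifting by n semitones multiplies a frequency by 2^(n/12). The base frequencies and interval choices below are illustrative, not the project's actual four notes.

```python
# Minimal sketch of deriving dissonant partners from main notes by semitone
# shifts in equal temperament. Base notes and intervals are illustrative.

def shift_semitones(freq, n):
    """Shift a frequency up (n > 0) or down (n < 0) by n equal-tempered semitones."""
    return freq * 2 ** (n / 12)

main_notes = [220.0, 261.63, 293.66, 329.63]  # approx. A3, C4, D4, E4

# A tritone (6 semitones) above each main note gives a strongly dissonant partner.
dissonant = [shift_semitones(f, 6) for f in main_notes]
```

Shifting by 12 semitones doubles (or halves) the frequency, so octave-related material stays consonant while odd intervals such as the tritone supply the tension described above.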
Techniques and Sound Design
- Additive Synthesis: Combined frequencies into chords, transitioning between major, minor, augmented, and diminished modes.
- Frequency Modulation (FM): Introduced dissonance and tension through waveform transitions, symbolizing transformation.
- Granular Synthesis: Mimicked insect sounds, aligning with the transformation phase, with dynamically adjusted grain duration.
Additional Techniques
- Panning and delays for spatial depth.
- Envelopes for dynamic shaping.
- Random timing for unpredictable metronome effects.
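Of the techniques above, FM is the easiest to show compactly: a modulator oscillator varies the phase of a carrier, and raising the modulation index pushes the tone from a pure sine toward a harsher spectrum, which is what produces the sense of tension. This is a generic FM sketch, not the PD patch; all rates and frequencies are illustrative.

```python
# Generic FM synthesis sketch: y[t] = sin(2*pi*fc*t + index*sin(2*pi*fm*t)).
# Carrier/modulator frequencies and the index are illustrative values.
import math

def fm_samples(fc, fm, index, sr=44100, n=64):
    """Generate n samples of an FM waveform at sample rate sr."""
    out = []
    for i in range(n):
        t = i / sr
        out.append(math.sin(2 * math.pi * fc * t
                            + index * math.sin(2 * math.pi * fm * t)))
    return out

# index = 0 reduces to a plain sine; larger indices add sidebands (more tension).
clean = fm_samples(220.0, 110.0, 0.0)
tense = fm_samples(220.0, 110.0, 5.0)
```

Sweeping the index over time is one simple way to realize the waveform transitions that symbolize transformation.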
TouchDesigner
To send values from Pure Data (PD) to TouchDesigner (TD), I utilized OSC (Open Sound Control) messages. Initially, this setup worked flawlessly, allowing smooth communication between PD and TD. However, on January 1st, the OSC messages suddenly stopped being sent, despite no apparent errors.
After troubleshooting, I traced the issue to the syntax and library being used. I eventually found an alternative approach that defined the OSC message more explicitly and used PD's netsend object for better stability.
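To give a sense of what "defining the OSC message more explicitly" means at the wire level, the sketch below encodes a single-float OSC message per the OSC 1.0 format: a null-padded address, a type-tag string, then big-endian arguments. The address and value are illustrative; in the project itself this framing was handled inside PD.

```python
# Sketch of OSC 1.0 binary framing for a message with one 32-bit float.
# Address and value are illustrative assumptions.
import struct

def osc_pad(b):
    """Null-pad bytes to a multiple of 4, as OSC requires (always >= 1 null)."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_float_message(address, value):
    """Encode an OSC message carrying a single 32-bit float argument."""
    return (osc_pad(address.encode("ascii"))   # address pattern, e.g. "/freq"
            + osc_pad(b",f")                   # type-tag string: one float
            + struct.pack(">f", value))        # big-endian float argument

packet = osc_float_message("/freq", 440.0)
```

Every field being padded to a 4-byte boundary is exactly the kind of detail that, when a library gets it subtly wrong, produces messages a receiver silently drops without reporting an error.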
03 Results
Visual and Performance Outputs
The final results showcased seamless integration of audio and visual elements, representing the metamorphosis journey effectively. Below are snapshots from the performance, highlighting key moments and technical achievements: