Brutal delusions

June of 2023

Porto, Portugal

Overview

In a collaborative project with a team of three, I contributed to the creation of a 10-minute visually reactive performance using TouchDesigner. Drawing inspiration from live performances with reactive visuals, we designed an experience that blended dark, emotional music with evolving visuals to evoke nostalgia and mystery. Over six weeks, I focused on developing the reactive visuals, ensuring they seamlessly synchronized with the atmospheric soundscape to create a dreamy and immersive atmosphere.

Project Goals

The project aimed to deliver an engaging performance by integrating music and visuals seamlessly, so that the audience experiences the two as a single, cohesive piece.

Warning: flashing lights

01 Discover

Context

This project is an immersive VJ performance that combines gloomy, dreamy music with visually fitting, dynamic visuals. The performance aims to evoke a sense of nostalgia and mystery, using sharp, architectural black-and-white photographs paired with trippy, distorted visuals. The overall aesthetic is built around a dreamlike atmosphere, with visuals that react to the snares, kicks, and varying frequencies in the music.

The performance involves the seamless integration of several mediums and technologies: a computer, a projector, speakers, and DJ software such as VirtualDJ or Mixxx to control the audio, while TouchDesigner generates the visuals in real time. As the performance progresses, the visuals evolve, becoming more colorful, liquid-like, and bright to complement the shifting mood of the music, and they grow more distorted and trippy as they react to the audience. The goal is a cohesive, immersive experience in which sound and visuals work together to deepen the emotional impact of the performance.
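The reactive pipeline itself was built as a TouchDesigner network, but the core idea, measuring the energy of a frequency band and mapping it onto a clamped visual parameter, can be sketched in plain Python with NumPy. The function names (`band_energy`, `to_param`) and the gain value here are illustrative, not part of the actual project:

```python
import numpy as np

def band_energy(samples, sample_rate, lo_hz, hi_hz):
    """Mean magnitude of the FFT bins between lo_hz and hi_hz."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    mask = (freqs >= lo_hz) & (freqs < hi_hz)
    return float(spectrum[mask].mean())

def to_param(energy, lo=0.0, hi=1.0, gain=0.01):
    """Map raw band energy onto a clamped 0..1 visual parameter."""
    return float(np.clip(energy * gain, lo, hi))

# A 440 Hz test tone: its energy should land in the low/mid
# band, not in the highs.
sr = 44100
t = np.arange(2048) / sr
tone = np.sin(2 * np.pi * 440 * t)
low = band_energy(tone, sr, 20, 1000)
high = band_energy(tone, sr, 5000, 15000)
print(low > high)  # True
```

In TouchDesigner the same mapping would typically be wired with an Audio Spectrum CHOP feeding a Math CHOP whose output is exported to a visual parameter.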

Challenges

Several challenges arose during the project:

  • Audio-Visual Synchronization: Maintaining real-time synchronization between the audio and visuals, ensuring that each visual reacts accurately to the correct frequency, beat, or movement.
  • Complexity of Interactivity: Balancing the interactive nature of visuals (reacting to both audio and motion) with the need for seamless transitions between different visual states.
  • Camera Motion Detection: Accurately detecting camera input and mapping it to the correct visual reactions, such as the movement of squares or the generation of white blocks around a person’s silhouette.
  • Dynamic Control Management: Managing multiple controls (sliders, main switch, alpha fade, etc.) effectively during a live performance, ensuring smooth transitions and intuitive operation.
  • Network Latency for OSC Communication: Minimizing the potential lag or delay in OSC communication between the mobile app and TouchDesigner, ensuring real-time responsiveness to user input.
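The camera motion detection challenge above is commonly tackled with frame differencing: compare successive frames and treat large brightness changes as motion. A minimal sketch, assuming 8-bit grayscale frames; the threshold and function names are illustrative, not the project's actual detector:

```python
import numpy as np

def motion_mask(prev_frame, curr_frame, threshold=25):
    """Boolean mask of pixels whose brightness changed by more than threshold."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff > threshold

def motion_amount(prev_frame, curr_frame, threshold=25):
    """Fraction of the frame that moved, usable as a 0..1 control signal."""
    return float(motion_mask(prev_frame, curr_frame, threshold).mean())

# Two synthetic 100x100 grayscale frames: a bright square "moves" into view.
prev = np.zeros((100, 100), dtype=np.uint8)
curr = prev.copy()
curr[40:60, 40:60] = 200          # 400 of 10,000 pixels change
print(motion_amount(prev, curr))  # 0.04
```

The resulting mask can drive the white squares directly, while the scalar motion amount works as a global intensity control.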

02 Research

Research

For the visual and auditory inspiration for the performance, I turned to several sources that matched the aesthetic and mood I wanted to evoke in the piece. Some of the most important references include:

Visual References

  • "Aphex Twin Live Visuals" by Weirdcore – A highly influential visual artist whose work has shaped my approach to creating sharp, architectural, and black-and-white visuals. Watch here
  • Sharp, Architectural, Black & White Aesthetic – An aesthetic I wanted to incorporate into my visuals. The combination of bold contrasts and geometric forms enhances the feeling of surrealism. Watch here

Music References

The music was a major influence in setting the tone of the project, and I curated a playlist of tracks that captured the desired vibe of dreamy, melancholy, and atmospheric sounds:

  1. Molchat Doma – Я Не Коммунист
  2. Molchat Doma – Sudno
  3. Molchat Doma – Volny
  4. Mareux – The Perfect Girl
  5. Boy Harsher – Give Me a Reason
  6. New Order – Blue Monday
  7. Molchat Doma – Sudno (repeat)

03 Results

Visual Reactions to Audio

  1. First Visual (Building Reacts to Overall Sound Input): The building’s form reacts to the frequency spectrum, driving a fluid, immersive visual transformation.
  2. Second Visual (White Squares Reacting to Camera Input): White squares react to camera input, moving based on detected motion and synchronizing with low frequencies.
  3. Third Visual (Big Black and White Cube): The cube rotates and zooms based on audience movement and music rhythm.
  4. Fourth Visual (Black and White People – Movement and Feedback): White blocks grow out of a person’s silhouette, creating continuous interaction.
  5. Fifth Visual (Black and White Bright Blur): The blur effect pulses in sync with high-frequency beats.
  6. Sixth Visual (Grains and Feathers – Kick and Rhythm Snares): Grainy textures react to kick drums and snares, adding an organic, rhythmic visual texture.
  7. Seventh Visual (Colorful Liquid – Kick, Snares, Medium Frequencies): Liquid-like visuals respond to kick drums, snares, and medium frequencies, creating fluid, colorful animations.
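Several of these reactions (the grains and the liquid in particular) hinge on picking out individual kick and snare hits rather than smooth frequency energy. One common approach is to flag frames where low-band energy jumps well above its running average; the sketch below illustrates that idea, with a threshold ratio and function name that are assumptions, not the project's exact detector:

```python
import numpy as np

def detect_kicks(low_band_energy, threshold_ratio=1.5):
    """Flag frames where low-band energy jumps above the running average.

    low_band_energy holds one energy value per visual frame; a kick is a
    frame whose energy exceeds threshold_ratio times the mean so far.
    """
    kicks, history = [], []
    for e in low_band_energy:
        avg = np.mean(history) if history else e
        kicks.append(bool(history) and e > threshold_ratio * avg)
        history.append(e)
    return kicks

# Steady low-band energy with two spikes where the kick drum hits.
energies = [1.0, 1.1, 0.9, 4.0, 1.0, 1.0, 5.0, 1.1]
print([i for i, k in enumerate(detect_kicks(energies)) if k])  # [3, 6]
```

Each detected hit can then trigger a burst of grain density or a push on the liquid distortion for one or two frames.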

Controls within TouchDesigner

  1. Offset (1st Visual): Controls waveform offset, affecting visual evolution.
  2. Scale (1st Visual): Adjusts line thickness in the building structure.
  3. Light Size (5th Visual): Controls the size of the rhythmic blur effect.
  4. Grains (6th Visual): Adjusts the graininess in the feather visual based on rhythm.
  5. Liquid Size (7th Visual): Alters the size of liquid waves for dynamic movement.
  6. Main Switch: Toggles between visual effects, enabling overlapping visuals.
  7. Alpha: Fade-in and fade-out slider for smooth visual transitions.
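The Alpha slider's behaviour amounts to a clamped linear ramp, and the Main Switch can overlap two visuals by crossfading their alphas in opposite directions. A minimal sketch of that logic; the two-second duration is an assumption for illustration:

```python
def alpha_fade(t, duration=2.0, fade_in=True):
    """Linear 0..1 alpha ramp over `duration` seconds, clamped at the ends."""
    a = min(max(t / duration, 0.0), 1.0)
    return a if fade_in else 1.0 - a

def crossfade(t, duration=2.0):
    """Alpha pair (outgoing, incoming) during a visual switch."""
    incoming = alpha_fade(t, duration, fade_in=True)
    return 1.0 - incoming, incoming

print(crossfade(0.0))  # (1.0, 0.0)
print(crossfade(1.0))  # (0.5, 0.5)
print(crossfade(2.0))  # (0.0, 1.0)
```

Because the two alphas always sum to one, the overlapping visuals never drop to black mid-transition.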

OSC Controls via Mobile Device

The mobile app, OSC Controller for Android, controls the second visual (the white squares) in real time:

  1. Slider 1 (Resolution Control): Controls the number of squares, enhancing visual detail.
  2. Slider 2 (Interaction Control): Alters the interaction between the squares and the subject’s movement or silhouette mapping.
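On the receiving end, each incoming OSC address has to be routed to the parameter it drives. In the project this routing happens inside TouchDesigner (typically via an OSC In CHOP); the sketch below shows the same idea in plain Python, with OSC addresses, parameter names, and value ranges that are all hypothetical:

```python
# Current state of the second visual's parameters (illustrative names).
visual_params = {"resolution": 10, "interaction": 0.0}

# Assumed OSC addresses; each maps a 0..1 slider value onto a parameter.
OSC_ROUTES = {
    "/slider1": ("resolution", lambda v: int(4 + v * 60)),  # 0..1 -> 4..64 squares
    "/slider2": ("interaction", lambda v: float(v)),        # 0..1 passthrough
}

def handle_osc(address, value):
    """Route one incoming OSC float (0..1) to the matching visual parameter."""
    if address not in OSC_ROUTES:
        return  # ignore unmapped addresses
    param, convert = OSC_ROUTES[address]
    visual_params[param] = convert(value)

handle_osc("/slider1", 0.5)
handle_osc("/slider2", 0.8)
print(visual_params)  # {'resolution': 34, 'interaction': 0.8}
```

Keeping the conversion functions next to the route table makes it easy to rescale a slider during rehearsal without touching the rest of the network.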