For a number of years I have explored reactive music visualizations, as well as systems in Processing that simultaneously produce interesting sound and visuals. Below are some examples; a complete list of videos of my audio-visual work can be found on my YouTube channel.
Reactive Music Visualizations with Processing
The following examples use Processing to visualize music that has been composed by other means. In all three cases the music is largely algorithmically generated.
Knight on Your Other Right (December, 2020)
Lightning Bug (August, 2020)
Lobophyllia II (2019)
Combined Audio-Visual Work with Processing
The following examples were produced using Processing to create the visuals and sound simultaneously.
Three agents move around the screen and choose musical rectangles to play. Made entirely with Processing, using a library I’m developing for MIDI handling.
Similar to the above: agents move around the screen leaving color-changing paint dabs and emitting sounds based on their position on the screen.
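The core idea behind these agent pieces can be sketched in a few lines. This is an illustrative toy in Python rather than the actual Processing code; the canvas size, the random-walk step, and the position-to-pitch mapping (horizontal position picks a scale degree, vertical position sets loudness) are all assumptions made for the example.

```python
import random

WIDTH, HEIGHT = 640, 480          # assumed canvas size
C_MAJOR = [0, 2, 4, 5, 7, 9, 11]  # scale degrees as semitone offsets

class Agent:
    def __init__(self, rng):
        self.rng = rng
        self.x = rng.uniform(0, WIDTH)
        self.y = rng.uniform(0, HEIGHT)

    def step(self):
        # Random walk, clamped to the canvas.
        self.x = min(WIDTH, max(0.0, self.x + self.rng.gauss(0, 10)))
        self.y = min(HEIGHT, max(0.0, self.y + self.rng.gauss(0, 10)))

    def pitch(self):
        # Horizontal position picks a degree across three octaves of C major.
        degree = int(self.x / WIDTH * 21)
        return 48 + 12 * (degree // 7) + C_MAJOR[degree % 7]

    def velocity(self):
        # Vertical position sets loudness (louder near the top).
        return 127 - int(self.y / HEIGHT * 96)

if __name__ == "__main__":
    rng = random.Random(42)
    agents = [Agent(rng) for _ in range(3)]
    for _ in range(4):
        for a in agents:
            a.step()
            print(f"note pitch={a.pitch()} velocity={a.velocity()}")
```

In the real sketches the pitch/velocity pairs would be sent out as MIDI note events on each animation frame, but the mapping step looks essentially like this.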
Polygons mingle with each other, periodically illuminating and producing sound. Made entirely with Processing using a library I’m developing for MIDI handling.
Processing reacts to music generated by L-systems in Python. Uses the Processing library I’m developing for MIDI handling, as well as a Python port of Euterpea.
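As a rough illustration of score generation with an L-system (not the actual code behind the piece; the rewrite rules and the symbol-to-pitch mapping here are invented for the example): every symbol in the string is rewritten in parallel each generation, and the final string is read off as a note sequence.

```python
def expand(axiom, rules, generations):
    """Rewrite every symbol in parallel, `generations` times."""
    s = axiom
    for _ in range(generations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Hypothetical rewrite rules and a mapping from symbols to MIDI pitches.
RULES = {"a": "ab", "b": "ac", "c": "a"}
PITCH = {"a": 60, "b": 64, "c": 67}  # C, E, G

melody = [PITCH[ch] for ch in expand("a", RULES, 4)]
print(melody)
```

The resulting pitch list would then be rendered to sound (in the piece above, through a Python port of Euterpea) while Processing visualizes the incoming notes.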
Other Music Visualizations
The following are visualizations produced with Magic Music Visuals for compositions I first made with other media.
Work with synthesizers.
Manipulation of audio feedback recordings.
Score-level generation with an early version of Kulitta, rendered to sound through virtual instruments designed in Euterpea.