I believe that programming languages and artificial intelligence algorithms have a tremendous capacity to augment human creativity, and I explore this through music. Although I also sometimes compose music without the involvement of a machine, much of my recent music has focused on using the computer as a partner in the creative process rather than simply as a tool for more standard music production.
Audio-Visual Work with Haskell/Processing
For the last two years, I have been working on algorithms for pattern-based generative music and generative jazz. I have both explored EDM-like music with these approaches and experimented with jazz through my Algo-Jazz Series of compositions and visualizations.
Knight on Your Other Right is one of my most recent algo-jazz pieces. I was only responsible for the high-level structure of the piece and let my collection of algo-instrumentalists do the rest.
Lobophyllia II is an algorithmic jazz adaptation of an old composition of mine. It uses output from a Python-based interactive bossa nova program I made, which was given a lead sheet for the main progressions in the original Lobophyllia. Certain melodies and bass riffs were taken from the original piece and mingled with the algorithmic components. The visualization was created with Processing.
Dot Matrix is an EDM-like algorithmic piece with a reactive visualization. Pitch contours were created using pattern-based generation algorithms that I presented in a demo at the 2018 Workshop on Functional Art, Music, Modeling, and Design (FARM). Rhythms were created separately using a stochastic system similar to those proposed by David Temperley. Both implementations were done in Haskell with Euterpea, and Kulitta features later in the piece on hand drums. The visualization, implemented in Processing, is a multi-agent system where the dancing blue/purple dots (which pulse with the high synth and bass kick) are attracted to the yellow dots (triggered by piano). Dot Matrix was performed at both Electronic Music Midwest 2018 and the FARM 2018 performance evening.
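The stochastic-rhythm idea can be illustrated with a small, self-contained Haskell sketch. This is not the actual implementation used in Dot Matrix; the probability table and function names are illustrative assumptions. The core idea, loosely in the spirit of Temperley's models, is that each metrical position in a bar gets its own onset probability, with stronger beats more likely to carry a note:

```haskell
import System.Random (mkStdGen, randoms)

-- One bar of sixteen sixteenth-note slots; stronger metrical
-- positions get a higher chance of carrying an onset.
-- These probabilities are illustrative, not Temperley's actual values.
onsetProb :: Int -> Double
onsetProb slot
  | slot `mod` 8 == 0 = 0.95  -- downbeat and mid-bar
  | slot `mod` 4 == 0 = 0.70  -- remaining quarter-note beats
  | slot `mod` 2 == 0 = 0.40  -- eighth-note offbeats
  | otherwise         = 0.15  -- weak sixteenth-note positions

-- Sample one bar of onsets (True = note onset) from a seed.
sampleBar :: Int -> [Bool]
sampleBar seed =
  zipWith (\slot r -> r < onsetProb slot) [0 .. 15]
          (randoms (mkStdGen seed) :: [Double])
```

Calling sampleBar with different seeds yields different but stylistically consistent bars, which could then be paired with separately generated pitch contours.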
Algorithmic Compositions using Kulitta
Kulitta is a framework for algorithmic and automated composition that I developed as the subject of my dissertation at Yale University. It (or “she”) is the main subject of my ongoing work in music and artificial intelligence. Kulitta uses a combination of generative grammars and geometric models from music theory to break complex composition tasks down into an iterative process. GitHub repository: https://github.com/donya/KulittaCompositions
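To give a rough flavor of the generative-grammar side, here is a toy Haskell sketch (the data type, rules, and function names are simplified illustrations of mine, not Kulitta's actual types or rules): a Roman-numeral chord symbol is repeatedly rewritten by production rules, so a single tonic can be elaborated into a longer progression over several iterations.

```haskell
-- Toy chord grammar in the spirit of Kulitta's generative grammars;
-- the symbols and rules here are simplified illustrations only.
data Chord = I | II | IV | V deriving (Eq, Show)

-- Each rule expands one chord symbol into a short progression.
expand :: Chord -> [Chord]
expand I  = [I, IV, V, I]  -- tonic prolonged by a cadence
expand V  = [II, V]        -- dominant prepared by a predominant
expand IV = [IV]
expand II = [II]

-- Apply the rewrite n times to every symbol in the sequence.
generate :: Int -> [Chord] -> [Chord]
generate 0 cs = cs
generate n cs = generate (n - 1) (concatMap expand cs)
```

For example, generate 1 [I] yields [I,IV,V,I], and further iterations elaborate that skeleton into progressively longer progressions that still begin and end on the tonic.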
Etude by Kulitta (PDF score) was composed by Kulitta and performed by me on piano. This piece illustrates Kulitta’s capacity for handling performance constraints as well as Kulitta’s stylistic blending capabilities, mixing models and rules for classical music (some of which are derived from Bach chorales) with models for jazz harmony.
Tandava is an algorithmic composition using Kulitta to generate percussion and piano. This piece was an experiment in expanding Kulitta’s capabilities in the rhythmic domain, creating complex patterns in a variety of different meters. Visuals were generated algorithmically.
Tourmaline (watch on YouTube, listen on SoundCloud) – a three-movement work created with Kulitta and digital synthesizers. Each movement utilizes different features of Kulitta. This piece was part of the Paul Hudak Memorial Symposium Listening Room at Yale University in April 2016. Visuals were generated algorithmically.
Original Compositions
Much of my recent original musical work has focused on the use of the ROLI Seaboard Block as a more expressive keyboard instrument. I have also explored sound manipulation to create gradually evolving sonic environments, as well as more traditional-style scores. The following pieces illustrate a combination of these approaches.
Phosphor was created with the ROLI Seaboard Block and is built on small motifs varied over time.
From a Dream is a sound-manipulation piece created using piano and guitar. Visuals were created algorithmically.
Fantasy for Bottles was written for a collection of glass bottle virtual instruments I developed using the Euterpea library for Haskell. I wrote the note-level score for this manually and then rendered it to audio using the Euterpea-based virtual instruments. Visuals were created algorithmically.