Clastic Music

What is Clastic Music?

Clastic Music is a set of musical pieces in which I play with the geological concept of the clast: new entities and forms created out of pre-existing fragments, akin to how machine creativity uses pre-existing data to generate new forms. In Clastic Music, the fragmentary entities are rhythmic patterns in simple and compound meter, common in contemporary music genres.

Clastic Music is the result of rhythmic improvisations using a machine-learning-assisted compositional approach. To create these rhythmic clasts, I use a custom-built software device based on a variational autoencoder. The pre-existing fragments, the training data, consist of datasets of rhythms in MIDI format, collected and assembled from post-dubstep, gqom, and footwork tracks.
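For readers who want a concrete picture of the idea, below is a minimal sketch of such a variational autoencoder in PyTorch. It is not the actual R-VAE implementation: it assumes each rhythmic pattern has been quantized to a binary onset matrix (here 16 steps by 9 drum instruments, both arbitrary choices) and uses a two-dimensional latent space so that the model can later be navigated directly in performance.

    # A minimal VAE sketch for rhythmic patterns, not the actual R-VAE code.
    # Assumption: each pattern is a binary onset matrix (16 steps x 9 instruments),
    # flattened into a single vector; the latent space is 2-D for performability.
    import torch
    import torch.nn as nn

    STEPS, INSTRUMENTS, LATENT_DIM = 16, 9, 2
    INPUT_DIM = STEPS * INSTRUMENTS

    class RhythmVAE(nn.Module):
        def __init__(self):
            super().__init__()
            self.encoder = nn.Sequential(nn.Linear(INPUT_DIM, 128), nn.ReLU())
            self.to_mu = nn.Linear(128, LATENT_DIM)
            self.to_logvar = nn.Linear(128, LATENT_DIM)
            self.decoder = nn.Sequential(
                nn.Linear(LATENT_DIM, 128), nn.ReLU(),
                nn.Linear(128, INPUT_DIM), nn.Sigmoid(),  # per-step onset probabilities
            )

        def forward(self, x):
            h = self.encoder(x)
            mu, logvar = self.to_mu(h), self.to_logvar(h)
            # Reparameterization trick: sample z while keeping gradients flowing
            z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
            return self.decoder(z), mu, logvar

    def vae_loss(recon, x, mu, logvar):
        # Reconstruction term plus KL divergence toward a unit Gaussian prior
        bce = nn.functional.binary_cross_entropy(recon, x, reduction="sum")
        kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
        return bce + kld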

During live performances, I explore and play with the models by mapping a two-dimensional performance space onto the latent space, so that rhythmic patterns are retrieved, decoded, and interpolated as an imaginary playback head moves through the latent space.
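As an illustration of this mapping, the sketch below (reusing the model and shapes from the example above) scales normalized controller coordinates to latent coordinates, decodes a latent point into an onset matrix, and interpolates linearly between two latent points as the playback head travels. The scaling factor and onset threshold are assumptions for the sketch, not values taken from R-VAE.

    # Latent-space navigation sketch, under the same assumptions as the model above.
    import torch

    def xy_to_latent(x, y, scale=3.0):
        # Map normalized controller coordinates (0..1) to latent coordinates,
        # centering the performance surface on the origin of the latent space.
        return torch.tensor([(x - 0.5) * 2 * scale, (y - 0.5) * 2 * scale])

    def decode_pattern(model, z, threshold=0.5):
        # Decode a latent point into a binary onset matrix (steps x instruments).
        with torch.no_grad():
            probs = model.decoder(z.unsqueeze(0)).view(STEPS, INSTRUMENTS)
        return (probs > threshold).int()

    def playback_head(model, z_start, z_end, n_frames=8):
        # Linear interpolation between two latent points: each frame decodes an
        # intermediate rhythm, morphing one pattern into the other.
        for t in torch.linspace(0.0, 1.0, n_frames):
            yield decode_pattern(model, (1 - t) * z_start + t * z_end)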

I provide links to videos of two clastic pieces made during a live performance at MUTEK 2020, the International Festival of Digital Creativity and Electronic Music, held in Montréal from September 8 to 13, 2020. In the first piece, I play directly with the latent space, retrieving rhythmic patterns from it; in the second, I improvise on top of previous latent space explorations, building a clast on top of another clast.

The online gallery for the NeurIPS Machine Learning for Creativity and Design workshop could exhibit these videos and/or a web-based visualizer that creates a dynamic representation of the rhythmic latent spaces I used during the performance. Visitors to the gallery could explore the latent spaces directly in their browsers and visualize the rhythmic pulses of the clastic space.

Here are a few examples of Clastic Music performed at MUTEK 2020 (Montréal, 11 September 2020):

E01

In this piece, I explore a rhythmic latent space modeled with R-VAE and learned from a small dataset of 12 MIDI clips of footwork. I use a KaossPad controller as a gestural interface: the XY position on this performance surface is mapped to the XY position in the latent space of R-VAE, and a rhythmic pattern is retrieved and decoded. R-VAE outputs MIDI messages; for E01, I use AAS Chromaphone 2 as a physical modeling sound engine. The piece is entirely improvised.
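A performance loop of this kind could look like the sketch below, with several loudly labeled assumptions: the KaossPad is assumed to transmit its pad position as MIDI CC 12 (X) and CC 13 (Y), which is device- and configuration-dependent; the port names are placeholders; and the General MIDI drum notes stand in for whatever mapping the sound engine actually receives. It reuses decode_pattern and xy_to_latent from the sketches above and uses the mido library for MIDI I/O.

    # Sketch of a controller-driven performance loop; not the actual R-VAE setup.
    # Assumptions: pad X/Y arrive as CC 12/13 (device-dependent); port names are
    # placeholders; General MIDI drum notes stand in for the real instrument map.
    import mido

    CC_X, CC_Y = 12, 13
    DRUM_NOTES = [36, 38, 42, 46, 41, 45, 48, 49, 51]  # one note per instrument row

    inport = mido.open_input("KaossPad")        # hypothetical port name
    outport = mido.open_output("Chromaphone")   # hypothetical port name

    x = y = 0.5
    for msg in inport:
        if msg.type != "control_change":
            continue
        if msg.control == CC_X:
            x = msg.value / 127.0
        elif msg.control == CC_Y:
            y = msg.value / 127.0
        pattern = decode_pattern(model, xy_to_latent(x, y))
        # Send the onsets of the first step only; a real scheduler would clock
        # through all 16 steps in time with the tempo.
        for instrument, note in enumerate(DRUM_NOTES):
            if pattern[0, instrument]:
                outport.send(mido.Message("note_on", note=note, velocity=100))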

E02