Autolume: Automating Live Music Visualisation

Using neural networks for image generation gives artists new means of expressing their creativity. With Autolume, we created a system that lets the user breathe life into visuals based on sonic input: the imagery dances and reacts to an audio feed, creating an audio-visual experience for the audience. With our tool, it is possible to accompany a music performance with varying degrees of customization, depending on the artist. The visualizer was recently adapted for an installation at the Distopya sound art festival in Istanbul in 2021 (https://metacreation.net/autolume-mzton) and for Light-Up Kelowna in 2022 (https://metacreation.net/autolume-acedia), through a collaboration between Philippe Pasquier and Jonas Kraasch.

For a technical explanation of how our system works, please refer to: https://metacreation.net/autolume-automating-live-music-visualisation-technical-report/
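The report above details the full pipeline. Purely as a rough, hypothetical illustration of the general audio-reactive idea (not Autolume's actual code), the sketch below maps an audio file's loudness envelope to a path through a generative model's latent space, so that louder passages drive faster visual motion. The generator `G` is a stand-in for any pretrained latent-to-image network.

```python
# Hypothetical sketch of audio-reactive latent interpolation.
# Not Autolume's implementation; see the technical report for the real system.
import numpy as np
import librosa

def audio_reactive_latents(audio_path, latent_dim=512, fps=30, seed=0):
    """Map an audio file to one latent vector per video frame, with
    latent-space motion speed following the signal's loudness envelope."""
    y, sr = librosa.load(audio_path, sr=None, mono=True)
    hop = int(sr / fps)                          # one loudness value per video frame
    rms = librosa.feature.rms(y=y, hop_length=hop)[0]
    rms = rms / (rms.max() + 1e-8)               # normalise loudness to [0, 1]

    rng = np.random.default_rng(seed)
    z_a = rng.standard_normal(latent_dim)        # two anchor points in latent space
    z_b = rng.standard_normal(latent_dim)

    frames, pos = [], 0.0
    for level in rms:
        pos = (pos + 0.02 * level) % 1.0         # louder audio -> faster travel
        t = 0.5 - 0.5 * np.cos(2 * np.pi * pos)  # smooth back-and-forth blend
        frames.append((1 - t) * z_a + t * z_b)
    return np.stack(frames)

# Usage (G is a hypothetical pretrained generator, e.g. a StyleGAN-style network):
# latents = audio_reactive_latents("performance.wav")
# for z in latents:
#     image = G(z)  # render or stream each frame
```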

We are currently looking for collaborators for future installations; if you are interested in discussing a collaboration, please reach out to Jonas Kraasch.
