Metacreation Lab Newsletter | AUGUST 2022

Philippe Pasquier Presenting at Mila Music + AI Reading Group

Friday, August 12th at 11 AM PST | 2 PM EST | Zoom

Metacreation Lab director Philippe Pasquier and PhD researcher Jeff Enns will be presenting next week at the Music + AI Reading Group hosted by Mila. The presentation will be available as a Zoom meeting.

Mila is a community of more than 900 researchers specializing in machine learning and dedicated to scientific excellence and innovation. The institute is recognized for its expertise and significant contributions in areas such as language modelling, machine translation, object recognition, and generative models.


Calliope Composition Environment & Latest Research

Calling all music makers! We'd like to share some exciting news about one of the latest music creation tools to come out of the lab: Calliope.

Calliope is an interactive environment based on MMM for symbolic music generation in computer-assisted composition. Using this environment, the user can generate or regenerate symbolic music from a “seed” MIDI file through a practical and easy-to-use graphical user interface (GUI). Through MIDI streaming, the system can interface with your favourite DAW (Digital Audio Workstation), such as Ableton Live, allowing creators to combine the possibilities of generative composition with their preferred virtual instruments and sound design environments.
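For readers curious about what streaming generated MIDI to a DAW can look like in practice, here is a minimal, purely illustrative sketch. It uses the third-party mido library and placeholder note data; the port name and helper function are assumptions for the example, not Calliope's actual API.

```python
# Illustrative sketch only: shows the general idea of streaming generated
# MIDI notes to a DAW over a virtual MIDI port using the `mido` library.
# It does NOT reproduce Calliope's internal implementation.
import time
import mido

# Hypothetical port name; in practice you would choose the virtual port your DAW listens on.
PORT_NAME = "Calliope Out"

def stream_notes(notes, port_name=PORT_NAME, tempo_bpm=120):
    """Send a list of (pitch, duration_in_beats) pairs as live MIDI messages."""
    seconds_per_beat = 60.0 / tempo_bpm
    with mido.open_output(port_name, virtual=True) as port:
        for pitch, beats in notes:
            port.send(mido.Message("note_on", note=pitch, velocity=100))
            time.sleep(beats * seconds_per_beat)
            port.send(mido.Message("note_off", note=pitch))

# Placeholder phrase standing in for model output (C, E, G, C an octave up).
stream_notes([(60, 1.0), (64, 0.5), (67, 0.5), (72, 2.0)])
```

In a setup like this, the DAW simply treats the virtual port as another MIDI input, so the generated material can be routed to any virtual instrument.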

The project has now entered an open beta-testing phase, and we are inviting music creators to try the compositional system for themselves! Head to the Metacreation Lab website to learn more and register for the beta testing.

For an example of Calliope's capabilities, listen to “the synthrider,” a piece by Philippe Pasquier and Renaud Bougueng Tchemeube for the 2022 AI Song Contest. The track explores the synthwave genre and early-'80s synthesizers as revisited by MMM, a modern AI music algorithm.

Click the image below to stream this Italo-disco fantasy of a machine on the AI Song Contest Website.

The research behind the Calliope project was also recently presented at C&C, the ACM Conference on Creativity and Cognition, which took place in Venice, Italy, this June. The paper is now available via the ACM Digital Library, and we invite you to learn more about it.


AIMC 2022 - Registration Open

September 13-15 | Online

Registration has opened for the 3rd Conference on AI Music Creativity (AIMC 2022), which will be held 13-15 September 2022. The conference features 22 accepted papers, 14 music works, and 2 workshops. Registered participants will get full access to the scientific and artistic program, as well as conference workshops and virtual social events.

The full conference program is now available online.

Registration is free but mandatory, and is available here:


Autolume Live - Publication  

Jonas Kraasch & Philippe Pasquier recently presented their latest work on the Autolume system at xCoAx, the 10th annual Conference on Computation, Communication, Aesthetics & X. Their paper is an in-depth exploration of the ways that creative artificial intelligence is increasingly used to generate static and animated visuals.

While there are a host of systems for generating images, videos and music videos, there is a lack of real-time video synthesisers for live music performances. To address this gap, Kraasch and Pasquier propose Autolume-Live, the first GAN-based live VJing system for controllable video generation.
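To give a rough sense of how audio-reactive GAN visuals work in general, here is a hypothetical sketch in which a loudness measurement steers a generator's latent vector frame by frame. It illustrates the general technique only; the generator, latent size, and step rule are assumptions for the example, not Autolume-Live's implementation.

```python
# Minimal, hypothetical sketch of audio-reactive GAN visuals: an audio feature
# (here, RMS loudness) nudges the latent vector toward a moving target each frame,
# so the imagery responds to the live music. `generator` stands in for any
# pretrained GAN generator; this is NOT the Autolume-Live code.
import torch

LATENT_DIM = 512                      # typical StyleGAN-style latent size (assumption)
z = torch.randn(1, LATENT_DIM)        # current latent
target = torch.randn(1, LATENT_DIM)   # destination latent

def next_frame(generator, audio_rms):
    """Blend the current latent toward the target, scaled by audio loudness."""
    global z, target
    step = min(1.0, float(audio_rms)) * 0.1      # louder audio -> faster movement
    z = (1.0 - step) * z + step * target
    if torch.norm(z - target) < 1.0:             # pick a new destination when close
        target = torch.randn(1, LATENT_DIM)
    with torch.no_grad():
        return generator(z)                      # image tensor to display this frame
```

The key design choice in systems of this kind is the mapping from audio features to latent motion, which determines how tightly the visuals follow the music.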


MITACS Accelerate award – partnership with Kinetyx


We are excited to announce that the Metacreation Lab researchers will be expanding their work on motion capture and movement data thanks to a new MITACS Accelerate research award. 

The project will focus on body pose estimation using motion capture data acquisition, through a partnership with Kinetyx, a Calgary-based innovative technology firm that develops in-shoe sensor-based solutions for a broad range of sports and performance applications.


Movement Database - MoDa

On the subject of motion data and its many uses in conjunction with machine learning and AI, we invite you to check out the extensive Movement Database (MoDa), led by transdisciplinary artist and scholar Shannon Cuykendall and AI researcher Omid Alemi.

Spanning a wide range of categories such as dance, affect-expressive movements, gestures, eye movements, and more, this database offers a wealth of experiments and captured data available in a variety of formats.