Evaluating Human-AI Interaction in Computer-Assisted Music Composition with Calliope

Are you interested in music-making and AI technology?

At the Metacreation Lab for Creative Artificial Intelligence at Simon Fraser University (SFU), we are conducting a research study on creative AI tools for music composition. Today we are releasing, testing, and evaluating a new version of Calliope, our web environment for computer-assisted MIDI musical pattern generation.

The system is built on MMM, our Multi-Track Music Machine model, which generates, re-generates, or fills in completely new musical content based on existing material and its instrumentation. In this way, you use existing musical content, your own or someone else's, as a prompt for your generation requests. The interface offers per-track controls for polyphony, note density, and note length.

We seek to better evaluate the potential for adoption of such systems by novice as well as seasoned composers. Specifically, you will be asked to create a short 4-track composition and to fill out a questionnaire at the end.

There is no prerequisite for this study beyond basic knowledge of music software (DAWs) and MIDI, so everyone is welcome, even if you do not consider yourself a composer but are interested in participating. The entire study should take around two hours. As per our research ethics approval, you must be 19 years of age or older.

For any questions or further inquiries, please contact researcher Renaud Bougueng Tchemeube directly at rbouguen@sfu.ca.

Watch the 1-minute YouTube demo: https://www.youtube.com/watch?v=pnf9VloRDg0

Try Calliope v0.11 here: https://www.metacreation.net/calliope/

