OUR PROJECTS
MASOM: Musical Agent based on Self-Organizing Maps
MASOM learns how to play music by listening to some.
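As a rough illustration of the self-organizing-map idea behind MASOM (not the lab's actual implementation), the sketch below trains a tiny 1-D SOM on toy 2-D "audio feature" vectors: each map node holds a weight vector, inputs are assigned to their best-matching unit, and nearby nodes are pulled toward each input with a shrinking neighborhood. All dimensions, grid sizes, and feature clusters here are invented for the demo.

```python
import random
import math

random.seed(0)

GRID = 8   # number of map nodes (assumed for the demo)
DIM = 2    # toy feature dimension, e.g. loudness and brightness

# Each node starts with a random weight vector.
nodes = [[random.random() for _ in range(DIM)] for _ in range(GRID)]

def bmu(x):
    """Index of the best-matching unit (closest node) for input x."""
    return min(range(GRID),
               key=lambda i: sum((nodes[i][d] - x[d]) ** 2 for d in range(DIM)))

def train(data, epochs=50, lr0=0.5, radius0=GRID / 2):
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)                     # decaying learning rate
        radius = max(1.0, radius0 * (1 - t / epochs))   # shrinking neighborhood
        for x in data:
            b = bmu(x)
            for i in range(GRID):
                dist = abs(i - b)
                if dist <= radius:
                    # Gaussian neighborhood: closer nodes move more.
                    h = math.exp(-dist * dist / (2 * radius * radius))
                    for d in range(DIM):
                        nodes[i][d] += lr * h * (x[d] - nodes[i][d])

# Two toy "styles" of sound, clustered near (0.2, 0.2) and (0.8, 0.8).
data = ([[0.2 + random.gauss(0, 0.05), 0.2 + random.gauss(0, 0.05)] for _ in range(30)]
        + [[0.8 + random.gauss(0, 0.05), 0.8 + random.gauss(0, 0.05)] for _ in range(30)])
train(data)

# After training, the two styles land on different regions of the map.
print(bmu([0.2, 0.2]), bmu([0.8, 0.8]))
```

After listening (training), new sounds can be routed to their best-matching unit, which is what lets such an agent organize and recall the material it has heard.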
Mova: Using Aesthetic Interaction to Interpret and View Movement Data
The Lab's movement visualisation tool
Walknet: Affective Movement Recognition and Generation
Machine learning models that recognize the valence and arousal of movement data
Automatic Pure Data Patch Generation
Automatically synthesizes Pure Data patches using Cartesian genetic programming
Music Composed by a Genetic Algorithm
An overlapping harmonic progression, generated using a harmonic analysis of 87 compositions.
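To give a flavor of how a genetic algorithm can compose a harmonic progression, here is a minimal sketch. The chord vocabulary, the toy transition table (a stand-in for statistics one might mine from a corpus), and all GA parameters are assumptions for illustration, not the analysis of the 87 compositions mentioned above.

```python
import random

random.seed(1)

CHORDS = ["I", "ii", "IV", "V", "vi"]
# Toy set of "good" first-order chord transitions (invented for the demo).
GOOD = {("I", "IV"), ("I", "V"), ("ii", "V"), ("IV", "V"),
        ("V", "I"), ("V", "vi"), ("vi", "IV"), ("IV", "I")}

LENGTH = 8   # chords per progression
POP = 40     # population size

def fitness(prog):
    # Count transitions found in the table; reward ending on the tonic.
    score = sum((a, b) in GOOD for a, b in zip(prog, prog[1:]))
    return score + (prog[-1] == "I")

def crossover(a, b):
    cut = random.randrange(1, LENGTH)
    return a[:cut] + b[cut:]

def mutate(prog, rate=0.1):
    return [random.choice(CHORDS) if random.random() < rate else c
            for c in prog]

# Evolve: keep an elite, breed the rest from elite parents.
pop = [[random.choice(CHORDS) for _ in range(LENGTH)] for _ in range(POP)]
for gen in range(100):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(POP - 10)]

best = max(pop, key=fitness)
print("-".join(best), fitness(best))
```

Swapping the hand-written transition set for transition probabilities learned from a real corpus turns the same loop into a corpus-driven composer.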
Virtual Agent - Human Agent Interaction
Incorporating human performers within an ensemble of virtual agents.
Generating Live Notation
Generating musical notation for live performers, for interaction with robotic instruments.