GrooveNet is a collaborative project between Omid Alemi, Jules Françoise, and Philippe Pasquier. The goal is to teach a neural network to generate beat-synchronous dance movements for a given song, matching movement patterns to the corresponding musical patterns. For this project, the researchers created a database of synchronized groove moves and songs to serve as training data for the machine learning algorithm.

Rather than taking a supervised approach to training, the researchers treat this as an unsupervised learning problem. For each song, they extract audio descriptors and train a multi-modal neural network on both the audio descriptors and the joint rotations of the accompanying dance movements.
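
As a rough illustration of this pipeline, the sketch below extracts simple per-frame audio descriptors with librosa (MFCCs, RMS energy, and onset strength, all chosen here purely for illustration) and fits a small feed-forward PyTorch network that maps them to joint rotations. This is a minimal sketch under those assumptions, not the published GrooveNet model; the authors' actual architecture and descriptor set are described in the paper referenced below.

```python
# Illustrative sketch only: maps per-frame audio descriptors to joint
# rotations with a small feed-forward network. The published GrooveNet
# model is a different architecture; see the referenced paper.
import librosa
import numpy as np
import torch
import torch.nn as nn

HOP = 512  # analysis hop size in samples (an assumption, not from the paper)

def audio_descriptors(path):
    """Extract simple per-frame descriptors: 13 MFCCs, RMS, onset strength."""
    y, sr = librosa.load(path, sr=None)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13, hop_length=HOP)
    rms = librosa.feature.rms(y=y, hop_length=HOP)
    onset = librosa.onset.onset_strength(y=y, sr=sr, hop_length=HOP)[None, :]
    n = min(mfcc.shape[1], rms.shape[1], onset.shape[1])  # align frame counts
    return np.vstack([mfcc[:, :n], rms[:, :n], onset[:, :n]]).T  # (frames, 15)

def train(descriptors, joint_rotations, epochs=200):
    """Fit a toy descriptor-to-pose regressor.

    joint_rotations: (frames, n_joints * 3) array aligned frame-by-frame
    with the descriptors, e.g. Euler angles exported from motion capture
    (a hypothetical representation for this sketch).
    """
    x = torch.tensor(descriptors, dtype=torch.float32)
    y = torch.tensor(joint_rotations, dtype=torch.float32)
    model = nn.Sequential(
        nn.Linear(x.shape[1], 128), nn.ReLU(),
        nn.Linear(128, 128), nn.ReLU(),
        nn.Linear(128, y.shape[1]),
    )
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x), y)  # reconstruct poses from audio frames
        loss.backward()
        opt.step()
    return model
```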

Figure: the initial design of the GrooveNet system.

Some of the results of this research can be seen below, with examples of original dance moves generated in real time by the machine learning model in response to the music being played.

This rendered example shows the system's output for music by monobor.

This demonstration shows the GrooveNet system in action, responding in real time to a selection of different dance music tracks.
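
In outline, real-time behaviour like that shown in the demonstrations could look like the hypothetical loop below: descriptors are computed for each incoming audio block and pushed through a trained model to produce one pose per block. The function and parameter names (generate_poses, audio_blocks, descriptor_fn) are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical real-time loop: compute descriptors for each incoming audio
# block and map them to a pose. Names and block handling are illustrative.
import torch

def generate_poses(model, audio_blocks, sr, descriptor_fn):
    """Yield one predicted pose (joint rotations) per incoming audio block."""
    model.eval()
    with torch.no_grad():
        for block in audio_blocks:              # e.g. blocks from an audio input
            feats = descriptor_fn(block, sr)    # per-block descriptor vector
            x = torch.tensor(feats, dtype=torch.float32).unsqueeze(0)
            yield model(x).squeeze(0).numpy()   # pose sent to the 3D character
```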

More technical details on the models used, training data, and 3D demonstrations can be found on Omid Alemi's webpage.

References

Omid Alemi, Jules Françoise, and Philippe Pasquier. "GrooveNet: Real-Time Music-Driven Dance Movement Generation using Artificial Neural Networks." Accepted to the Workshop on Machine Learning for Creativity, 23rd ACM SIGKDD Conference on Knowledge Discovery and Data Mining, Halifax, Nova Scotia, Canada, 2017. PDF.
