MetaMIDI Dataset

At the recent 22nd International Society for Music Information Retrieval (ISMIR) Conference, Jeff Ens and Philippe Pasquier introduced the MetaMIDI Dataset (MMD), a large-scale collection of 436,631 MIDI files and associated metadata. MMD contains artist and title metadata for 221,504 MIDI files and genre metadata for 143,868 MIDI files, collected through a web-scraping process.

We anticipate that this collection will be of great use to music information retrieval (MIR) researchers addressing a variety of research topics!

MIDI is a technical standard and communications protocol for a wide variety of electronic musical instruments, computers, and related audio devices for playing, editing, and recording music. MIDI files contain data that specify instructions for music, including a note's notation, pitch, velocity (heard as the loudness or softness of the volume), vibrato, panning left or right in stereo, and clock signals (which set the tempo).
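To make the description above concrete, here is a minimal sketch of how a single MIDI note-on event is laid out at the byte level: a status byte combining the message type with the channel, followed by the pitch and velocity values. The `note_on` helper is hypothetical, written only to illustrate the standard MIDI wire format; it is not part of the dataset's tooling.

```python
# Illustrative sketch of the standard MIDI note-on message format.
# A note-on event is three bytes: status (0x90 | channel), pitch, velocity.
def note_on(channel: int, pitch: int, velocity: int) -> bytes:
    """Encode a MIDI note-on event. All values are 7-bit (0-127),
    and the channel is 0-15 (displayed as channels 1-16)."""
    assert 0 <= channel < 16 and 0 <= pitch < 128 and 0 <= velocity < 128
    return bytes([0x90 | channel, pitch, velocity])

# Middle C (pitch 60) at moderate loudness on channel 1.
msg = note_on(0, 60, 100)
```

A velocity of 0 in a note-on message is conventionally treated as a note-off, which is why sequencers distinguish "loudness" from note duration.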

MIDI files in MMD were matched against a collection of 32,000,000 30-second audio clips retrieved from Spotify, resulting in 10,796,557 audio-MIDI matches. In addition, we linked 600,142 Spotify tracks with 1,094,901 MusicBrainz recordings to produce a set of 168,032 MIDI files that are matched to the MusicBrainz database. The dataset also provides a set of 53,496 MIDI files using audio-MIDI matches where the metadata derived from Spotify is a fuzzy match to the web-scraped metadata.
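Fuzzy metadata matching of the kind mentioned above tolerates the noise typical of web-scraped artist and title strings (casing, punctuation, extra whitespace). As a rough illustration only, and not the paper's actual matching procedure, a similarity-ratio check with Python's standard-library `difflib` might look like this; the `fuzzy_match` helper and its 0.8 threshold are assumptions for the sketch.

```python
from difflib import SequenceMatcher

def fuzzy_match(a: str, b: str, threshold: float = 0.8) -> bool:
    """Return True if two metadata strings are similar enough.

    Normalizes case and surrounding whitespace before comparing,
    since scraped metadata is noisy. The threshold is illustrative.
    """
    ratio = SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()
    return ratio >= threshold

# Scraped title vs. streaming-service title with cosmetic differences.
fuzzy_match("The Beatles - Hey Jude", "the beatles - hey jude ")
```

In practice a matching pipeline would also strip punctuation and featured-artist suffixes before comparing, but the core idea is the same: accept pairs whose normalized similarity clears a threshold.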

Figure: The distribution of genres in the MetaMIDI Dataset for matched MIDI files, using two matching methods: audio and audio + text.

Access to the MetaMIDI Dataset is available through Zenodo.

Users will be granted access to the files in the MetaMIDI Dataset only for research purposes (specifically data mining or machine learning). Prospective users must provide their name, institutional affiliation, institutional contact information, the name of their research project, where the research is taking place, and an acknowledgement that they will not share or distribute the dataset.

Papers and Posters

Ens, J., & Pasquier, P. (2021). Building the MetaMIDI Dataset: Linking symbolic and audio musical data. Proceedings of the 22nd International Society for Music Information Retrieval Conference (ISMIR). https://archives.ismir.net/ismir2021/paper/000022.pdf
