Abstract
AMUSE (Advanced MUSic Explorer) was created in 2006 as an open-source Java framework for various music information retrieval (MIR) tasks such as feature extraction, feature processing, classification, and evaluation. In contrast to toolboxes that focus on individual MIR-related algorithms, AMUSE makes it possible, for instance, to extract features with Librosa, process them based on events estimated by MIRtoolbox, classify with WEKA or Keras, and validate the models with its own classification performance measures. We present several substantial contributions to AMUSE made since its first presentation at ISMIR 2010. These include an annotation editor for single and multiple tracks, support for multi-label and multi-class classification, and new plugins that operate with Keras, Librosa, and Sonic Annotator. Other integrated methods are structural complexity processing, a chord vector feature, the aggregation of features around estimated onset events, and the evaluation of time event extractors. Further advancements include more flexible feature extraction with different parameters such as frame sizes, the possibility to integrate additional tasks beyond algorithms related to supervised classification, the marking of features that can be ignored for a classification task, and the extension of algorithm parameters with external code (e.g., the structure of a Keras neural network).
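To illustrate the kind of cross-library pipeline the abstract describes, the following is a minimal Python sketch that calls Librosa and Keras directly rather than through AMUSE's Java framework; the file paths, labels, and network layout are placeholders, and the onset-based aggregation is only a rough stand-in for AMUSE's feature processing chain.

```python
# Illustrative sketch (not AMUSE's API): extract MFCC features with Librosa,
# aggregate them around estimated onset events, and classify with a small
# Keras model. Paths, labels, and network size are hypothetical placeholders.
import numpy as np
import librosa
from tensorflow import keras

def onset_aggregated_mfccs(path, n_mfcc=13):
    """Mean MFCC vector taken only at frames near estimated onsets."""
    y, sr = librosa.load(path, sr=22050)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)   # shape: (n_mfcc, frames)
    onsets = librosa.onset.onset_detect(y=y, sr=sr)          # onset frame indices
    frames = onsets if len(onsets) > 0 else np.arange(mfcc.shape[1])
    return mfcc[:, frames].mean(axis=1)                      # shape: (n_mfcc,)

# Placeholder training data: two hypothetical tracks with binary labels.
tracks = ["track_a.wav", "track_b.wav"]
labels = np.array([0, 1])
X = np.stack([onset_aggregated_mfccs(t) for t in tracks])

# Small Keras classifier standing in for the model AMUSE would train and validate.
model = keras.Sequential([
    keras.layers.Input(shape=(X.shape[1],)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(2, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X, labels, epochs=5, verbose=0)
```

In AMUSE itself, such steps are configured as separate feature extraction, processing, classification, and validation tasks rather than written as a single script; the sketch only shows the underlying tool calls being chained.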
Citation
Vatolkin, I., Ginsel, P., & Rudolph, G. (2021). Advancements in the Music Information Retrieval Framework AMUSE over the Last Decade. In SIGIR 2021 - Proceedings of the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval (pp. 2383–2389). Association for Computing Machinery, Inc. https://doi.org/10.1145/3404835.3463252