Monte Carlo methods for tempo tracking and rhythm quantization

Abstract

We present a probabilistic generative model for timing deviations in expressive music performance. The structure of the proposed model is equivalent to a switching state space model: the switch variables correspond to discrete note locations as in a musical score, and the continuous hidden variables denote the tempo. We formulate two well-known music recognition problems, namely tempo tracking and automatic transcription (rhythm quantization), as filtering and maximum a posteriori (MAP) state estimation tasks. Exact computation of posterior features such as the MAP state is intractable in this model class, so we introduce Monte Carlo methods for integration and optimization. We compare Markov chain Monte Carlo (MCMC) methods (such as Gibbs sampling, simulated annealing and iterative improvement) with sequential Monte Carlo methods (particle filters). Our simulations suggest that the sequential methods give better results. The methods can be applied in both online and batch scenarios, such as tempo tracking and transcription, and are thus potentially useful in a number of music applications such as adaptive automatic accompaniment, score typesetting and music information retrieval.
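To make the inference setup concrete, below is a minimal sketch (not the authors' code) of a bootstrap particle filter for tempo tracking under a simplified version of the switching state-space model described in the abstract: each particle carries a hypothesised onset time and beat period (the continuous state) and samples a discrete score interval (the switch variable) at every step, and observed onset times reweight the particles. The duration grid, noise levels, particle count and resampling rule are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Allowed score intervals between consecutive notes, in beats (an assumption).
DURATIONS = np.array([0.25, 0.5, 1.0, 1.5, 2.0])

def init_particles(n, period0=0.5):
    """Each particle: predicted onset time (s), beat period (s/beat), log-weight."""
    return {
        "onset": np.zeros(n),
        "period": period0 * np.exp(0.05 * rng.standard_normal(n)),
        "logw": np.zeros(n),
    }

def step(p, y, sigma_tempo=0.02, sigma_obs=0.01):
    """Advance all particles by one observed onset time y (seconds)."""
    n = len(p["onset"])
    # Tempo evolves as a random walk in the log domain.
    p["period"] = p["period"] * np.exp(sigma_tempo * rng.standard_normal(n))
    # Sample the discrete switch (score interval to the next note) per particle.
    idx = rng.integers(len(DURATIONS), size=n)
    p["onset"] = p["onset"] + DURATIONS[idx] * p["period"]
    # Reweight by a Gaussian observation likelihood on the onset time.
    p["logw"] += -0.5 * ((y - p["onset"]) / sigma_obs) ** 2
    # Resample when the effective sample size collapses.
    w = np.exp(p["logw"] - p["logw"].max())
    w /= w.sum()
    if 1.0 / np.sum(w ** 2) < n / 2:
        keep = rng.choice(n, size=n, p=w)
        for k in ("onset", "period"):
            p[k] = p[k][keep]
        p["logw"] = np.zeros(n)
    return p

# Toy usage: quarter-note onsets at 120 BPM (0.5 s period) with timing jitter.
particles = init_particles(500)
for y in np.cumsum(np.full(8, 0.5) + 0.005 * rng.standard_normal(8)):
    particles = step(particles, y)
w = np.exp(particles["logw"] - particles["logw"].max())
print("estimated beat period (s):", np.average(particles["period"], weights=w))
```

In the paper's model the discrete score positions would be tracked jointly with the tempo and the proposal would be better informed than this prior sampling; the sketch only illustrates the general sequential Monte Carlo scheme the abstract refers to.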

Citation (APA)

Cemgil, A. T., & Kappen, B. (2003). Monte Carlo methods for tempo tracking and rhythm quantization. Journal of Artificial Intelligence Research, 18, 45–81. https://doi.org/10.1613/jair.1121
