Joint estimation of chords and downbeats from an audio signal

Abstract

We present a new technique for joint estimation of the chord progression and the downbeats from an audio file. Musical signals are highly structured in terms of harmony and rhythm. In this paper, we intend to show that integrating knowledge of mutual dependencies between chords and metric structure allows us to enhance the estimation of these musical attributes. For this, we propose a specific topology of hidden Markov models that enables modelling chord dependence on metric structure. This model allows us to consider pieces with complex metric structures such as beat addition, beat deletion or changes in the meter. The model is evaluated on a large set of popular music songs from the Beatles that present various metric structures. We compare a semi-automatic model in which the beat positions are annotated, with a fully automatic model in which a beat tracker is used as a front-end of the system. The results show that the downbeat positions of a music piece can be estimated in terms of its harmonic structure and that conversely the chord progression estimation benefits from considering the interaction between the metric and the harmonic structures. © 2010 IEEE.
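The coupling the abstract describes can be illustrated with a toy joint HMM. This is a minimal sketch, not the authors' exact model: the hidden state pairs a chord label with a position-in-measure, the position advances deterministically, and chord transitions are made much more likely across a barline than within a measure, so Viterbi decoding infers downbeat phase from where chord changes fall. All state sizes and probabilities below are illustrative assumptions.

```python
import numpy as np

N_CHORDS = 4  # toy chord vocabulary (illustrative)
N_POS = 4     # beat positions per 4/4 measure
N_STATES = N_CHORDS * N_POS

def state(chord, pos):
    """Index of the joint (chord, position-in-measure) state."""
    return chord * N_POS + pos

# Transition matrix: position advances deterministically (p -> p+1 mod 4).
# Chord changes are plausible only when the next position is the downbeat.
A = np.zeros((N_STATES, N_STATES))
for c in range(N_CHORDS):
    for p in range(N_POS):
        nxt = (p + 1) % N_POS
        for c2 in range(N_CHORDS):
            if nxt == 0:  # crossing a barline: chord may change
                A[state(c, p), state(c2, nxt)] = 0.4 if c2 == c else 0.6 / (N_CHORDS - 1)
            else:         # within the measure: chord almost always held
                A[state(c, p), state(c2, nxt)] = 0.97 if c2 == c else 0.03 / (N_CHORDS - 1)

def viterbi(obs_loglik, A, init):
    """Standard Viterbi decode. obs_loglik: (T, N_STATES) per-beat log-likelihoods."""
    T, N = obs_loglik.shape
    logA = np.log(A + 1e-12)
    delta = np.log(init + 1e-12) + obs_loglik[0]
    psi = np.zeros((T, N), dtype=int)
    for t in range(1, T):
        scores = delta[:, None] + logA   # scores[i, j]: best path ending i -> j
        psi[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + obs_loglik[t]
    path = [int(delta.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t][path[-1]]))
    path.reverse()
    return path

# Toy observations: chord 2 for 8 beats, then chord 0 for 8 beats.
# The likelihood is agnostic to position, so downbeat phase must be
# inferred purely from where the chord change lands.
T = 16
obs = np.full((T, N_STATES), -5.0)
for t in range(T):
    true_chord = 2 if t < 8 else 0
    for p in range(N_POS):
        obs[t, state(true_chord, p)] = -1.0

init = np.full(N_STATES, 1.0 / N_STATES)
path = viterbi(obs, A, init)
chords = [s // N_POS for s in path]
positions = [s % N_POS for s in path]
print(chords)     # decoded chord per beat
print(positions)  # decoded position-in-measure per beat
```

Because chord changes are cheap only at barlines, the decoder aligns the measure grid so the change at beat 8 lands on a downbeat (position 0) — the same interaction, in miniature, that the paper exploits in both directions.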

Citation (APA)

Papadopoulos, H., & Peeters, G. (2011). Joint estimation of chords and downbeats from an audio signal. IEEE Transactions on Audio, Speech, and Language Processing, 19(1), 138–152. https://doi.org/10.1109/TASL.2010.2045236
