Beat synchronous dance animation based on visual analysis of human motion and audio analysis of music tempo


Abstract

We present a framework that generates beat-synchronous dance animation based on the analysis of both visual and audio data. First, the articulated motion of a dancer is captured from markerless visual observations obtained by a multicamera system. We propose and employ a new method for the temporal segmentation of such motion data into dance periods. Next, we use a beat tracking algorithm to estimate the pulse related to the tempo of a piece of music. Given input music of the same genre as the one corresponding to the visually observed dance, we automatically produce a beat-synchronous dance animation of a virtual character. The proposed approach has been validated with extensive experiments performed on a data set containing a variety of traditional Greek/Cretan dances and the corresponding music. © 2013 Springer-Verlag.
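The core synchronization step described above, aligning segmented dance periods with beats estimated from the music, can be illustrated with a minimal sketch. This is not the authors' implementation; the function name `warp_time` and the piecewise-linear time-warping strategy are assumptions chosen for illustration only:

```python
# Illustrative sketch (not the paper's actual method): retime a captured
# dance period so that beats within the motion land on the music's
# estimated beat times, via piecewise-linear time warping.

def warp_time(motion_beats, music_beats, t):
    """Map a motion-time t (seconds) onto music time by linear
    interpolation between corresponding beat pairs."""
    assert len(motion_beats) == len(music_beats) >= 2
    # Clamp times outside the beat range.
    if t <= motion_beats[0]:
        return music_beats[0]
    if t >= motion_beats[-1]:
        return music_beats[-1]
    # Find the enclosing beat interval and interpolate linearly.
    for m0, m1, s0, s1 in zip(motion_beats, motion_beats[1:],
                              music_beats, music_beats[1:]):
        if m0 <= t <= m1:
            frac = (t - m0) / (m1 - m0)
            return s0 + frac * (s1 - s0)

# Example: dance beats at 0, 1, 2 s; music beats at 0, 0.5, 1.0 s
# (the music's tempo is twice as fast, so motion is compressed 2x).
print(warp_time([0.0, 1.0, 2.0], [0.0, 0.5, 1.0], 1.5))  # -> 0.75
```

In a full pipeline, each animation frame's timestamp would be remapped this way before sampling the captured skeletal poses, so that the virtual character's movement cycles stay locked to the music's pulse.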

Citation (APA)

Panagiotakis, C., Holzapfel, A., Michel, D., & Argyros, A. A. (2013). Beat synchronous dance animation based on visual analysis of human motion and audio analysis of music tempo. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8034 LNCS, pp. 118–127). https://doi.org/10.1007/978-3-642-41939-3_12
