Gesture interaction for electronic music performance


Abstract

This paper describes a system that analyses an orchestra conductor in real time and uses the extracted tempo and expression information to automatically play a computer-controlled instrument (synthesizer). In its final stage, the system will use non-intrusive computer vision methods to track the conductor's hands. The main challenge is to interpret the motion of the hand, baton, or mouse as beats on the timeline. The current implementation uses mouse motion to simulate the movement of the baton, allowing the user to "conduct" a pre-stored MIDI file of a classical orchestral work on a PC. © Springer-Verlag Berlin Heidelberg 2007.
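The abstract does not include code, but the core idea it names, interpreting pointer motion as beats, can be illustrated with a minimal sketch. The sketch below assumes pointer samples (timestamp, vertical position) arrive from a mouse or tracker, treats the bottom turning point of each conducting stroke as a beat, and maps the inter-beat interval to a tempo in BPM that a MIDI scheduler could use. The names (`BeatTracker`, `on_sample`) are hypothetical, not from the paper.

```python
import math


class BeatTracker:
    """Extracts conducting beats from a stream of pointer samples.

    A beat is registered when vertical motion reverses from downward
    to upward (the bottom of a conducting stroke). Screen coordinates
    are assumed, so y increases downward.
    """

    def __init__(self, min_interval=0.25):
        self.min_interval = min_interval  # debounce window in seconds
        self.prev_y = None                # last vertical position
        self.prev_dy = 0.0                # sign of last non-zero vertical motion
        self.last_beat = None             # timestamp of the previous beat
        self.bpm = None                   # current tempo estimate

    def on_sample(self, t, y):
        """Feed one sample (t in seconds, y in pixels); return current BPM."""
        if self.prev_y is not None:
            dy = y - self.prev_y
            # A downward-to-upward reversal marks the beat point.
            if self.prev_dy > 0 and dy < 0:
                if self.last_beat is None or t - self.last_beat >= self.min_interval:
                    if self.last_beat is not None:
                        self.bpm = 60.0 / (t - self.last_beat)
                    self.last_beat = t
            if dy != 0:
                self.prev_dy = dy
        self.prev_y = y
        return self.bpm


if __name__ == "__main__":
    tracker = BeatTracker()
    bpm = None
    # Simulate a baton bouncing at 2 Hz; the expected estimate is 120 BPM.
    for i in range(400):
        t = i * 0.01
        y = math.sin(2 * math.pi * 2.0 * t)
        bpm = tracker.on_sample(t, y)
    print(f"estimated tempo: {bpm:.0f} BPM")
```

The debounce window keeps jitter near the turning point from producing spurious beats; a real implementation would additionally smooth the position signal and, as the paper suggests, use the beat stream to stretch or compress the timing of events in a pre-stored MIDI file.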

Citation (APA)

Behringer, R. (2007). Gesture interaction for electronic music performance. In Lecture Notes in Computer Science (Vol. 4552, pp. 564–572). Springer-Verlag. https://doi.org/10.1007/978-3-540-73110-8_61
