This paper describes an approach for a system that analyses an orchestra conductor in real time, with the purpose of using the extracted information on tempo and expression for the automatic play of a computer-controlled instrument (synthesizer). In its final stage, the system will use non-intrusive computer vision methods to track the hands of the conductor. The main challenge is to interpret the motion of the hand/baton/mouse as beats on the timeline. The current implementation uses mouse motion to simulate the movement of the baton. It allows the user to "conduct" a pre-stored MIDI file of a classical orchestral work on a PC. © Springer-Verlag Berlin Heidelberg 2007.
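The core step described above, interpreting baton or mouse motion as beats that drive playback tempo, can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual algorithm: it assumes a beat is registered when vertical pointer motion reverses from downward to upward (the "bounce" of a conducting gesture), and derives a tempo from the inter-beat interval. The class and method names are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class BeatDetector:
    """Hypothetical sketch: detect conducting beats from pointer motion.

    A beat is registered when vertical motion reverses from downward to
    upward (screen y grows downward). The inter-beat interval yields a
    tempo in BPM, which a sequencer could use to pace MIDI playback.
    The reversal heuristic is an illustrative assumption.
    """
    last_y: float = 0.0
    prev_dy: float = 0.0
    last_beat_time: Optional[float] = None

    def update(self, y: float, t: float) -> Optional[float]:
        """Feed one pointer sample (y position, timestamp in seconds).

        Returns an estimated tempo in BPM when a beat completes an
        interval, otherwise None.
        """
        dy = y - self.last_y
        tempo = None
        # Downward-to-upward reversal marks a beat point.
        if self.prev_dy > 0 and dy < 0:
            if self.last_beat_time is not None:
                interval = t - self.last_beat_time
                if interval > 0:
                    tempo = 60.0 / interval  # beats per minute
            self.last_beat_time = t
        self.last_y = y
        self.prev_dy = dy
        return tempo
```

Fed with mouse samples one second apart at the gesture's lowest points, the detector would report 60 BPM; in the system described, such an estimate would update the tempo of the pre-stored MIDI file in real time.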
CITATION STYLE
Behringer, R. (2007). Gesture interaction for electronic music performance. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4552 LNCS, pp. 564–572). Springer Verlag. https://doi.org/10.1007/978-3-540-73110-8_61