Muse: A music conducting recognition system

Abstract

In this paper, we introduce Music in a Universal Sound Environment (MUSE), a system for gesture recognition in the domain of musical conducting. Our system captures a conductor's musical gestures to drive a MIDI-based music generation system, allowing a human user to conduct a fully synthetic orchestra. Moreover, our system aims to improve a conductor's technique in a fun and interactive environment. We describe how our system facilitates learning through an intuitive graphical interface, and how we use techniques from machine learning and Conga, a finite state machine, to process input from a low-cost Leap Motion sensor and estimate the beat patterns that a conductor suggests through hand motions. To explore other beat detection algorithms, we also include a machine learning module that uses Hidden Markov Models (HMMs) to detect the beat patterns of a conductor. An additional experiment was conducted toward a future expansion of the machine learning module with Recurrent Neural Networks (RNNs), and its results outperform those of a set of HMMs. MUSE allows users to control the tempo of a virtual orchestra in real time through basic conducting patterns used by conductors. Finally, we discuss a number of ways in which our system can be used for educational and professional purposes.
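To illustrate the HMM-based beat detection the abstract describes, the sketch below trains one Gaussian HMM per conducting pattern and classifies a new hand trajectory by log-likelihood. It is a minimal illustration only, assuming the `hmmlearn` library and synthetic (x, y) hand-position data; the paper does not specify the authors' feature set, toolkit, or model topology.

```python
# Minimal sketch: one HMM per beat pattern, classification by highest log-likelihood.
# Assumes `hmmlearn` and 2-D hand-position trajectories; not the authors' implementation.
import numpy as np
from hmmlearn import hmm

def train_pattern_models(training_data, n_states=4):
    """Fit one Gaussian HMM per beat pattern.

    training_data: dict mapping a pattern label (e.g. "3/4") to a list of
    trajectories, each an (n_frames, 2) array of hand positions.
    """
    models = {}
    for label, trajectories in training_data.items():
        X = np.vstack(trajectories)               # concatenate all frames
        lengths = [len(t) for t in trajectories]  # per-sequence lengths
        model = hmm.GaussianHMM(n_components=n_states,
                                covariance_type="diag",
                                n_iter=50)
        model.fit(X, lengths)
        models[label] = model
    return models

def classify_gesture(models, trajectory):
    """Return the pattern whose HMM assigns the highest log-likelihood."""
    return max(models, key=lambda label: models[label].score(trajectory))

# Example with random stand-in data (real input would come from the Leap Motion sensor).
rng = np.random.default_rng(0)
data = {"2/4": [rng.normal(size=(60, 2)) for _ in range(5)],
        "3/4": [rng.normal(loc=1.0, size=(60, 2)) for _ in range(5)]}
models = train_pattern_models(data)
print(classify_gesture(models, rng.normal(size=(60, 2))))
```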

Citation (APA)

Carthen, C. D., Kelley, R., Ruggieri, C., Dascalu, S. M., Colby, J., & Harris, F. C. (2018). Muse: A music conducting recognition system. In Advances in Intelligent Systems and Computing (Vol. 558, pp. 363–369). Springer Verlag. https://doi.org/10.1007/978-3-319-54978-1_49
