Creating musical expression using kinect

ISSN: 2220-4806

Abstract

Recently, Microsoft introduced a game interface called Kinect for the Xbox 360 video game platform. This interface enables users to control and interact with the game console without touching a controller, greatly increasing their freedom to express emotion. In this paper, we first describe the system we developed to use this interface for sound generation and for controlling musical expression. Skeleton data are extracted from the users' motions and translated into pre-defined MIDI data, which we then use to control several applications. To perform this translation, we implemented a simple Kinect-to-MIDI data converter, which is introduced in this paper. We describe two applications for making music with Kinect: we first generate sound with Max/MSP, and then control ad-lib melodies with our own ad-lib generating system driven by the users' body movements.
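The paper's Kinect-to-MIDI converter is not reproduced in this abstract, so the sketch below only illustrates one plausible mapping of the kind described: a tracked joint position translated into a MIDI note-on message. The function name, the coordinate normalization, and the pitch range are all assumptions for illustration, not the authors' actual implementation.

```python
def clamp(v, lo, hi):
    """Restrict v to the closed interval [lo, hi]."""
    return max(lo, min(hi, v))

def hand_to_midi_note(x, y, low_note=48, high_note=84):
    """Map a hand joint position to a MIDI note-on message.

    Assumes x and y are already normalized to [0, 1] (e.g. from the
    sensor's depth-frame coordinates). Hand height (y) selects pitch
    across [low_note, high_note]; horizontal position (x) selects
    velocity. Returns the three bytes of a note-on on channel 1 (0x90).
    """
    span = high_note - low_note
    note = low_note + int(round(clamp(y, 0.0, 1.0) * span))
    velocity = int(round(clamp(x, 0.0, 1.0) * 127))
    return (0x90, note, velocity)
```

In a running system, these three bytes would be written to a virtual MIDI port so that a receiver such as Max/MSP can pick them up; the byte-level message format itself is standard MIDI.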

Citation (APA)

Yoo, M. J., Beak, J. W., & Lee, I. K. (2011). Creating musical expression using kinect. In Proceedings of the International Conference on New Interfaces for Musical Expression (pp. 324–325). International Conference on New Interfaces for Musical Expression.
