Mixed Reality Musical Interface: Exploring Ergonomics and Adaptive Hand Pose Recognition for Gestural Control

Abstract

The study of extended reality musical instruments is a burgeoning topic in the field of new interfaces for musical expression. We developed a mixed reality musical interface (MRMI) as a technology probe to inspire design for experienced musicians. In particular, we explore (i) the ergonomics of the interface in relation to musical expression and (ii) user-adaptive hand pose recognition for gestural control. The MRMI probe was experienced by 10 musician participants (mean age: 25.6 years [SD = 3.0], 6 female, 4 male). We conducted a user evaluation comprising three stages: an experimentation period, after which participants were asked to accompany a pre-recorded piece of music, and a post-task stage in which participants took part in semi-structured interviews, which were subjected to thematic analysis. Prevalent themes included reducing the size of the interface, issues with the device's field of view, and physical strain from playing. Participants were largely in favour of hand poses as expressive control, although this depended on customisation and temporal dynamics; the use of interactive machine learning (IML) for user-adaptive hand pose recognition was well received.
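
To illustrate the kind of user-adaptive hand pose recognition the abstract describes, the sketch below shows a minimal interactive machine learning loop in the Wekinator tradition: the musician records a few labelled examples of each hand pose, a lightweight classifier is retrained immediately, and recognition can be refined with further examples. This is an assumption-laden illustration, not the authors' implementation; the classifier choice (k-nearest neighbours via scikit-learn), the HandPoseIML class, and the 21-landmark input format are all hypothetical.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier

class HandPoseIML:
    """Illustrative interactive machine learning loop for user-adaptive
    hand pose recognition: record examples, retrain instantly, predict."""

    def __init__(self, n_neighbors=3):
        self.n_neighbors = n_neighbors
        self.examples = []  # flattened landmark vectors, one per example
        self.labels = []    # user-chosen pose names
        self.model = None

    def add_example(self, landmarks, pose_label):
        # landmarks: hypothetically 21 (x, y, z) hand joints from the
        # headset's hand tracker, flattened to a 63-dimensional vector
        self.examples.append(np.asarray(landmarks, dtype=float).ravel())
        self.labels.append(pose_label)

    def train(self):
        # Retraining from scratch is near-instant for the tens of
        # examples a user records, which keeps the loop interactive.
        k = min(self.n_neighbors, len(self.examples))
        self.model = KNeighborsClassifier(n_neighbors=k)
        self.model.fit(np.stack(self.examples), self.labels)

    def predict(self, landmarks):
        if self.model is None:
            return None
        x = np.asarray(landmarks, dtype=float).ravel().reshape(1, -1)
        return self.model.predict(x)[0]

# Typical session: record a few examples per pose, retrain, then
# classify each tracked frame (random vectors stand in for tracking data).
recognizer = HandPoseIML()
for _ in range(5):
    recognizer.add_example(np.random.rand(21, 3), "open_hand")
    recognizer.add_example(np.random.rand(21, 3), "fist")
recognizer.train()
print(recognizer.predict(np.random.rand(21, 3)))

In a musical setting, the predicted pose label would then be mapped to an expressive control, for example switching timbres or modulating a synthesis parameter; the appeal of the IML loop reported by participants is that misrecognised poses can be corrected simply by recording more examples and retraining.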

Citation (APA)

Graf, M., & Barthet, M. (2022). Mixed Reality Musical Interface: Exploring Ergonomics and Adaptive Hand Pose Recognition for Gestural Control. In Proceedings of the International Conference on New Interfaces for Musical Expression. https://doi.org/10.21428/92fbeb44.56ba9b93
