This paper describes the design, theoretical underpinnings, and development of a hyperinstrumental performance system driven by gestural data obtained from an electric guitar. The system combines a multichannel audio feed (parsed for pitch contour, spectral content, and note inter-onset timing) with motion tracking of the performer's larger-scale bodily movements via a Microsoft Kinect sensor. These gestural materials form the basis of the system's musical mapping strategies, which are informed by an integration of embodied cognitive models with electroacoustic/electronic music theory (specifically, Smalley's spectromorphology). The system's sound processing is further animated by Reynolds's boids flocking algorithm, which provides an embodied/ecological basis for connecting Lerdahl's spatial and syntactical models of tonal harmony with sound spatialization and textural processing. Through this work, we aim to advance broadly applicable performance gesture ecologies, providing typologies that facilitate creative (but still coherent) mappings from physical and figurative performance gestures to spatial and textural structures.
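To illustrate the kind of flocking behaviour the abstract refers to, the following is a minimal 2-D sketch of Reynolds's boids rules (separation, alignment, cohesion). It is not the paper's implementation: the weights, neighbourhood radius, and the `centroid` mapping hook are illustrative assumptions of how flock state might drive a spatialization parameter such as pan position.

```python
# Minimal 2-D boids sketch (Reynolds-style flocking).
# All parameter values below are illustrative, not taken from the paper.
import random

class Boid:
    def __init__(self, x, y):
        self.pos = [x, y]
        self.vel = [random.uniform(-1, 1), random.uniform(-1, 1)]

def step(boids, sep_w=0.05, ali_w=0.05, coh_w=0.01, radius=5.0):
    """Apply one update of the three classic boids steering rules."""
    for b in boids:
        sep = [0.0, 0.0]; ali = [0.0, 0.0]; coh = [0.0, 0.0]; count = 0
        for other in boids:
            if other is b:
                continue
            dx = other.pos[0] - b.pos[0]
            dy = other.pos[1] - b.pos[1]
            if dx * dx + dy * dy < radius * radius:
                sep[0] -= dx; sep[1] -= dy                       # separation: steer away
                ali[0] += other.vel[0]; ali[1] += other.vel[1]   # alignment: match heading
                coh[0] += other.pos[0]; coh[1] += other.pos[1]   # cohesion: seek centroid
                count += 1
        if count:
            ali = [a / count - v for a, v in zip(ali, b.vel)]
            coh = [c / count - p for c, p in zip(coh, b.pos)]
            b.vel[0] += sep_w * sep[0] + ali_w * ali[0] + coh_w * coh[0]
            b.vel[1] += sep_w * sep[1] + ali_w * ali[1] + coh_w * coh[1]
    for b in boids:
        b.pos[0] += b.vel[0]
        b.pos[1] += b.vel[1]

def centroid(boids):
    """Flock centroid: one plausible value to map to a spatial pan position."""
    n = len(boids)
    return (sum(b.pos[0] for b in boids) / n,
            sum(b.pos[1] for b in boids) / n)
```

In a mapping of this kind, per-step flock statistics (centroid, spread, mean speed) would be sampled and routed to spatial and textural processing parameters; the specific routings in the paper's system are governed by its gesture typologies rather than by the raw algorithm alone.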
Graham, R., & Bridges, B. (2014). Gesture and embodied metaphor in spatial music performance systems design. In Proceedings of the International Conference on New Interfaces for Musical Expression (pp. 581–584). International Conference on New Interfaces for Musical Expression.