Musical sonification of avatar physiologies, virtual flight and gesture


Abstract

Virtual actors moving through interactive game-space environments create rich streams of data that serve as drivers for real-time musical sonification. The paradigms of avian flight, biologically inspired kinesthetic motion, and manual control of avatar skeletal-mesh components through inverse kinematics are used in the musical performance work ECHO::Canyon to control real-time synthesis-based instruments within a multi-channel sound engine. This paper discusses gestural and control methodologies, as well as the specific mapping schemata used to link virtual actors with musical characteristics.
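The general mapping idea can be illustrated with a minimal sketch, which is not drawn from the paper itself: assuming avatar flight data (altitude, airspeed, wing angle) is forwarded over OSC to a synthesis engine, each motion parameter is clamped and rescaled into a synthesis control such as pitch, amplitude or filter cutoff. The OSC addresses, parameter names, value ranges and the python-osc transport below are illustrative assumptions only.

# Hypothetical sketch (not the paper's implementation): forward avatar flight
# data to a sound engine over OSC using simple linear mappings. All addresses,
# parameter names and ranges are assumptions for illustration.
from dataclasses import dataclass
from pythonosc.udp_client import SimpleUDPClient


@dataclass
class FlightState:
    altitude: float      # metres above the terrain
    airspeed: float      # metres per second
    wing_angle: float    # degrees, -90 (full down) to 90 (full up)


def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Clamp a control value to its input range and rescale it linearly."""
    value = max(in_lo, min(in_hi, value))
    norm = (value - in_lo) / (in_hi - in_lo)
    return out_lo + norm * (out_hi - out_lo)


def sonify(state: FlightState, client: SimpleUDPClient) -> None:
    # Altitude -> pitch, airspeed -> amplitude, wing angle -> filter cutoff
    # (these particular pairings are assumptions, not the paper's schema).
    client.send_message("/avatar/pitch", scale(state.altitude, 0, 2000, 80, 880))
    client.send_message("/avatar/amp", scale(state.airspeed, 0, 60, 0.0, 1.0))
    client.send_message("/avatar/cutoff", scale(state.wing_angle, -90, 90, 200, 8000))


if __name__ == "__main__":
    # Port 57120 is a common default for a SuperCollider server; any
    # OSC-capable sound engine could stand in here.
    client = SimpleUDPClient("127.0.0.1", 57120)
    sonify(FlightState(altitude=420.0, airspeed=18.5, wing_angle=12.0), client)

In practice a mapping layer like this would be called on every game-engine tick, so smoothing and rate limiting of the outgoing OSC stream are the main design concerns; the sketch omits both for brevity.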

Citation (APA)

Hamilton, R. (2014). Musical sonification of avatar physiologies, virtual flight and gesture. In Lecture Notes in Computer Science (Vol. 8905, pp. 518–532). Springer. https://doi.org/10.1007/978-3-319-12976-1_31
