Virtual actors moving through interactive game-space environments generate rich data streams that can drive real-time musical sonification. In the musical performance work ECHO::Canyon, the paradigms of avian flight, biologically inspired kinesthetic motion, and manual control of avatar skeletal-mesh components via inverse kinematics are used to control real-time synthesis-based instruments within a multi-channel sound engine. This paper discusses gestural and control methodologies, as well as the specific mapping schemata used to link virtual actors to musical characteristics.
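The mapping schemata described above can be illustrated with a minimal sketch. This is not the paper's implementation: the parameter names, ranges, and the simple linear scaling are all assumptions, shown only to make the idea of linking avatar flight state to synthesis controls concrete.

```python
# Illustrative sketch (not from the paper): a linear mapping from
# hypothetical avatar flight parameters to synthesis controls, in the
# spirit of the gesture-to-sound schemata the abstract describes.

def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly map value from [in_lo, in_hi] to [out_lo, out_hi], clamped."""
    t = max(0.0, min(1.0, (value - in_lo) / (in_hi - in_lo)))
    return out_lo + t * (out_hi - out_lo)

def map_flight_to_synth(altitude_m, airspeed_ms, bank_deg):
    """Map avatar flight state to synthesis parameters (all names hypothetical)."""
    return {
        # higher flight -> higher pitch (MIDI note range 36..96)
        "pitch_midi": scale(altitude_m, 0.0, 500.0, 36.0, 96.0),
        # faster flight -> louder output
        "amplitude": scale(airspeed_ms, 0.0, 40.0, 0.0, 1.0),
        # banking pans the voice across the output field
        "pan": scale(bank_deg, -45.0, 45.0, -1.0, 1.0),
    }

params = map_flight_to_synth(altitude_m=250.0, airspeed_ms=20.0, bank_deg=0.0)
print(params)
# → {'pitch_midi': 66.0, 'amplitude': 0.5, 'pan': 0.0}
```

In a performance setting, such mappings would typically be evaluated at frame rate and streamed to the sound engine (e.g. over a control protocol such as OSC), rather than computed once as shown here.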
Hamilton, R. (2014). Musical sonification of avatar physiologies, virtual flight and gesture. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8905, pp. 518–532). Springer Verlag. https://doi.org/10.1007/978-3-319-12976-1_31