Visualizing emotion in musical performance using a virtual character


Abstract

We describe an immersive music visualization application which enables interaction between a live musician and a responsive virtual character. The character reacts to live performance in such a way that it appears to be experiencing an emotional response to the music it 'hears.' We modify an existing tonal music encoding strategy in order to define how the character perceives and organizes musical information. We reference existing research correlating musical structures and composers' emotional intention in order to simulate cognitive processes capable of inferring emotional meaning from music. The ANIMUS framework is used to define a synthetic character who visualizes its perception and cognition of musical input by exhibiting responsive behaviour expressed through animation. © Springer-Verlag Berlin Heidelberg 2005.
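The mapping from perceived musical structure to emotional response can be illustrated with a toy rule-based classifier. This is not the paper's actual algorithm; it is a minimal sketch of the general idea found in the music-emotion literature (e.g. major mode and fast tempo tend to correlate with positive, high-arousal emotions, minor mode and slow tempo with negative, low-arousal ones). All names (`MusicalFeatures`, `infer_emotion`) and the 110 BPM threshold are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class MusicalFeatures:
    """Hypothetical container for features extracted from live performance."""
    mode: str         # "major" or "minor"
    tempo_bpm: float  # estimated beats per minute

def infer_emotion(f: MusicalFeatures) -> str:
    """Toy rule-based mapping from musical structure to an emotion label.

    Illustrative only: real systems weigh many more cues (dynamics,
    articulation, harmonic tension) and produce graded, not discrete, output.
    """
    fast = f.tempo_bpm >= 110  # arbitrary illustrative threshold
    if f.mode == "major":
        return "joy" if fast else "serenity"
    return "anger" if fast else "sadness"

# A slow minor-mode passage maps to a low-arousal negative emotion.
print(infer_emotion(MusicalFeatures(mode="minor", tempo_bpm=60)))  # sadness
```

In a system like the one described, such an inferred label would then drive the ANIMUS character's animated behaviour rather than being printed.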

Citation (APA)

Taylor, R., Boulanger, P., & Torres, D. (2005). Visualizing emotion in musical performance using a virtual character. In Lecture Notes in Computer Science (Vol. 3638, pp. 13–24). Springer Verlag. https://doi.org/10.1007/11536482_2
