This paper describes work towards the goal of enabling unscripted interaction with non-player characters in virtual environments. We hypothesize that we can define a layer of social affordances, based on physical and non-verbal signals exchanged between individuals and groups, which can be reused across games. We have implemented a first version of that substrate, which employs whole-body interaction with virtual characters and generates nuanced, real-time character performance in response. We describe the playable experience produced by the system and the implementation architecture (based on the behavior specification technology used in Façade, the social model employed in Prom Week, and gesture recognition technology), and we illustrate the key behaviors and programming idioms that enable character performance. These idioms include orthogonal coding of attitudes and activities, the use of relational rules to nominate social behavior, the use of volition rules to rank options, and priority-based interleaving of character animations.
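To make the named idioms concrete, the following is a minimal, hypothetical sketch in Python of how relational rules might nominate candidate social behaviors, volition rules rank them, and an animation priority break ties. The paper's system builds on the Façade behavior technology and the Prom Week social model rather than this code; all names, rules, and values below are illustrative assumptions, not drawn from the paper.

```python
# Hypothetical sketch: relational rules nominate social behaviors, volition
# rules score them, and animation priority resolves ties. Illustrative only.
from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple


@dataclass
class SocialState:
    friendship: int        # running attitude toward the player (illustrative scale)
    player_nearby: bool    # body-tracking proximity signal
    player_waving: bool    # recognized greeting gesture


@dataclass
class Behavior:
    name: str
    animation_priority: int  # higher-priority animations interleave over lower ones


# Relational rules: predicates over the social state that nominate candidate behaviors.
relational_rules: List[Tuple[Callable[[SocialState], bool], Behavior]] = [
    (lambda s: s.player_waving, Behavior("wave_back", 2)),
    (lambda s: s.player_nearby and s.friendship > 0, Behavior("greet_warmly", 3)),
    (lambda s: s.player_nearby and s.friendship <= 0, Behavior("turn_away", 1)),
]


def volition(state: SocialState, behavior: Behavior) -> int:
    """Volition rules: score a nominated behavior so options can be ranked."""
    score = 0
    if behavior.name == "greet_warmly":
        score += state.friendship      # friendlier NPCs prefer warm greetings
    if behavior.name == "wave_back":
        score += 1                     # acknowledging a wave is mildly desirable
    if behavior.name == "turn_away":
        score -= state.friendship      # dislike makes avoidance more attractive
    return score


def choose_behavior(state: SocialState) -> Optional[Behavior]:
    """Pick the highest-ranked nominated behavior; ties broken by animation priority."""
    candidates = [b for rule, b in relational_rules if rule(state)]
    if not candidates:
        return None
    return max(candidates, key=lambda b: (volition(state, b), b.animation_priority))


if __name__ == "__main__":
    state = SocialState(friendship=4, player_nearby=True, player_waving=True)
    print(choose_behavior(state))  # Behavior(name='greet_warmly', animation_priority=3)
```

In this sketch the nomination and ranking steps are deliberately separated, mirroring the abstract's distinction between relational rules (what is socially possible now) and volition rules (what this character most wants to do), with animation priority left to the performance layer.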
Shapiro, D., McCoy, J., Grow, A., Samuel, B., Stern, A., Swanson, R., … Mateas, M. (2013). Creating playable social experiences through whole-body interaction with virtual characters. In Proceedings of the 9th AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment (AIIDE 2013) (pp. 79–82). AAAI Press. https://doi.org/10.1609/aiide.v9i1.12691