Learning Adaptive Game Soundtrack Control

Abstract

In this paper, we demonstrate a novel technique for dynamically generating an emotionally-directed video game soundtrack. We begin with a human Conductor who observes gameplay and directs associated emotions intended to enhance the gameplay experience. We apply supervised learning to data sampled from synchronized gameplay features (input) and the Conductor's emotional direction (output) in order to fit a mathematical model of the Conductor's emotional direction. Then, during gameplay, this model maps gameplay state input to emotional direction output, which is fed to a music generation module that dynamically generates emotionally relevant music. Our empirical study suggests that random forests serve well for modeling the Conductor in our two experimental game genres.
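The abstract does not specify the feature set, emotion encoding, or music generation interface, but a minimal sketch of the Conductor-modeling step might look like the following, assuming a continuous valence/arousal representation of emotional direction and scikit-learn's RandomForestRegressor. The gameplay feature vector, the labels, and the `generate_music` function are illustrative placeholders, not the authors' implementation.

```python
# Minimal sketch of the pipeline described in the abstract.
# ASSUMPTIONS: emotional direction is encoded as (valence, arousal) in [0, 1],
# gameplay state is a fixed-length numeric feature vector, and `generate_music`
# stands in for the paper's music generation module.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# --- Offline: fit the Conductor model on synchronized samples ---------------
rng = np.random.default_rng(0)
X = rng.random((500, 8))   # stand-in for logged gameplay feature vectors
y = rng.random((500, 2))   # stand-in for the Conductor's (valence, arousal) labels

conductor_model = RandomForestRegressor(n_estimators=100, random_state=0)
conductor_model.fit(X, y)  # multi-output regression is supported natively

# --- Online: during gameplay, map state -> emotional direction -> music -----
def generate_music(valence: float, arousal: float) -> None:
    """Placeholder for the dynamic music generation module."""
    print(f"Generating music with valence={valence:.2f}, arousal={arousal:.2f}")

current_state = rng.random((1, 8))                      # live gameplay features
valence, arousal = conductor_model.predict(current_state)[0]
generate_music(valence, arousal)
```

Under these assumptions, a random forest is a plausible choice because it handles heterogeneous, unscaled gameplay features with little tuning, which may partly explain why it modeled the Conductor well in the reported study.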

Citation (APA)

Dorsey, A., Neller, T. W., Tran, H. G., & Yilmaz, V. (2023). Learning Adaptive Game Soundtrack Control. In Proceedings of the 37th AAAI Conference on Artificial Intelligence, AAAI 2023 (Vol. 37, pp. 16070–16077). AAAI Press. https://doi.org/10.1609/aaai.v37i13.26909
