The paper describes an interactive musical system that uses a genetic algorithm to create inspiring collaborations between human musicians and an improvisatory robotic xylophone player. The robot is designed to respond to human input acoustically and visually, evolving a human-generated phrase population in real time based on a similarity-driven fitness function. The robot listens to MIDI and audio input from human players and generates melodic responses informed by the analyzed input as well as by internalized knowledge of contextually relevant material. The paper describes the motivation for the project, the hardware and software design, two performances conducted with the system, and a number of directions for future work. © 2008 Springer-Verlag Berlin Heidelberg.
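The core idea of evolving a phrase population under a similarity-driven fitness function can be illustrated with a minimal sketch. The representation (phrases as lists of MIDI pitches), the distance-based similarity measure, and the selection, crossover, and mutation operators below are all illustrative assumptions, not the paper's actual implementation:

```python
import random

random.seed(42)  # deterministic for demonstration

def similarity(phrase, target):
    """Fitness: higher (closer to 0) when the phrase's pitches are
    nearer the target's. A hypothetical stand-in for the paper's
    similarity measure."""
    return -sum(abs(a - b) for a, b in zip(phrase, target))

def mutate(phrase, rate=0.2):
    """Randomly perturb some pitches by one or two semitones."""
    return [p + random.choice([-2, -1, 1, 2]) if random.random() < rate else p
            for p in phrase]

def crossover(a, b):
    """One-point crossover of two equal-length phrases."""
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def evolve(population, target, generations=50):
    """Evolve the phrase population toward the target phrase."""
    for _ in range(generations):
        population.sort(key=lambda p: similarity(p, target), reverse=True)
        parents = population[: len(population) // 2]  # truncation selection
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(len(population) - len(parents))]
        population = parents + children
    return max(population, key=lambda p: similarity(p, target))

# Target: a phrase "heard" from the human player (C-major fragment).
# Seed population: random eight-note MIDI phrases.
target = [60, 62, 64, 65, 67, 65, 64, 62]
population = [[random.randint(48, 72) for _ in range(8)] for _ in range(20)]
best = evolve(population, target)
print(best, similarity(best, target))  # closer to 0 = more similar
```

In the actual system the "target" would be derived continuously from live input analysis rather than fixed in advance, and the evolved phrases would be rendered on the robotic xylophone.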
CITATION STYLE
Weinberg, G., Godfrey, M., Rae, A., & Rhoads, J. (2008). A real-time genetic algorithm in human-robot musical improvisation. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4969 LNCS, pp. 351–359). https://doi.org/10.1007/978-3-540-85035-9_24