BrainiBeats: A dual brain-computer interface for musical composition using inter-brain synchrony and emotional valence


Abstract

A dual brain-computer interface (BCI) was developed to translate emotions and synchrony between two users into music. Using the EEG signals of two individuals, the system generates live music note-by-note and controls musical parameters such as pitch, intensity, and interval. The users' mean EEG amplitude determines the notes, and their emotional valence modulates the intensity (i.e., the volume of the music). Additionally, inter-brain synchrony is used to manipulate the interval between notes, with higher synchrony producing more pleasant music and lower synchrony producing less pleasant music. Further research is needed to test the system in an experimental setting; however, the literature suggests that neurofeedback based on inter-brain synchrony and emotional valence could be used to promote positive aspects of group dynamics and mutual emotional understanding.
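The mapping described above (mean amplitude → note, valence → volume, synchrony → interval) could be sketched as follows. All thresholds, value ranges, and interval tables here are illustrative assumptions for a minimal MIDI-style mapping, not the authors' actual implementation:

```python
# Hypothetical sketch of the note-by-note mapping described in the abstract.
# The ranges, thresholds, and interval tables are assumptions, not the
# authors' implementation.

CONSONANT_INTERVALS = [0, 4, 5, 7, 12]   # unison, 3rd, 4th, 5th, octave (semitones)
DISSONANT_INTERVALS = [1, 2, 6, 11]      # minor 2nd, major 2nd, tritone, major 7th

def amplitude_to_pitch(mean_amplitude_uv, low=-50.0, high=50.0):
    """Map the users' mean EEG amplitude (assumed microvolt range) to a
    MIDI pitch between C4 (60) and C6 (84)."""
    clipped = max(low, min(high, mean_amplitude_uv))
    fraction = (clipped - low) / (high - low)
    return int(round(60 + fraction * 24))

def valence_to_velocity(valence):
    """Map emotional valence in [-1, 1] to MIDI velocity (loudness), 30..110."""
    clipped = max(-1.0, min(1.0, valence))
    return int(round(30 + (clipped + 1.0) / 2.0 * 80))

def synchrony_to_interval(synchrony):
    """Choose the pitch interval to the next note: high inter-brain
    synchrony favors consonant (more pleasant) intervals, low synchrony
    dissonant ones."""
    table = CONSONANT_INTERVALS if synchrony >= 0.5 else DISSONANT_INTERVALS
    index = int(min(0.999, max(0.0, synchrony)) * len(table)) % len(table)
    return table[index]

def next_note(mean_amplitude_uv, valence, synchrony):
    """Produce one (pitch, velocity, interval) triple for live playback."""
    return (amplitude_to_pitch(mean_amplitude_uv),
            valence_to_velocity(valence),
            synchrony_to_interval(synchrony))
```

In this sketch, each call to `next_note` yields one note event; a real-time loop would feed it windowed EEG features and send the result to a synthesizer.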

Citation (APA)

Ceccato, C., Pruss, E., Vrins, A., Prinsen, J., & Alimardani, M. (2023). BrainiBeats: A dual brain-computer interface for musical composition using inter-brain synchrony and emotional valence. In Conference on Human Factors in Computing Systems - Proceedings. Association for Computing Machinery. https://doi.org/10.1145/3544549.3585910
