This work aims to define a computational model of human emotional entrainment. Music, as a non-verbal language for expressing emotions, is chosen as an ideal test bed. We start from multimodal gesture and motion signals, recorded in a real-world collaborative condition in an ecological setting. Four violin players were asked to play a music fragment, alone or in a duo, in two different perceptual feedback modalities and in four different emotional states. We focused our attention on the Phase Synchronisation of the players' head motions. From observations by subjects (musicians and observers), evidence of entrainment between players emerges. The preliminary results, based on a reduced data set, however, do not fully capture this phenomenon. A more extensive analysis is currently under investigation.
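The abstract does not specify the synchronisation measure used; a common choice for quantifying Phase Synchronisation between two motion signals is the Phase Locking Value (PLV), computed from instantaneous phases obtained via the Hilbert transform. The sketch below is a hypothetical illustration of that measure, not the authors' actual pipeline; the signals, sampling, and function names are assumptions for demonstration.

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """Phase Locking Value between two 1-D signals.

    PLV = |mean(exp(i * (phi_x - phi_y)))|: 1 means a perfectly
    constant phase relation, 0 means no consistent phase relation.
    """
    phi_x = np.angle(hilbert(x))  # instantaneous phase of x
    phi_y = np.angle(hilbert(y))  # instantaneous phase of y
    return np.abs(np.mean(np.exp(1j * (phi_x - phi_y))))

# Toy head-motion-like traces: same 1 Hz oscillation with a fixed lag
# (hypothetical stand-ins for the two players' head trajectories).
t = np.linspace(0.0, 10.0, 1000)
a = np.sin(2 * np.pi * 1.0 * t)
b = np.sin(2 * np.pi * 1.0 * t + 0.5)

print(phase_locking_value(a, b))  # near 1.0: strong phase locking
rng = np.random.default_rng(0)
print(phase_locking_value(a, rng.standard_normal(t.size)))  # much lower
```

A PLV near 1 between the two players' head motions would be consistent with the entrainment the observers report, while values near 0 indicate independent movement.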