Music therapy is increasingly being used to promote physical health. Emotion recognition based on electroencephalogram (EEG) signals is more objective than self-report and directly reflects the subject's real emotional state. We therefore proposed a music therapy method that performs emotion semantic matching between EEG signals and music audio signals, which improves the reliability of emotion judgments and, furthermore, mines the potential correlations between music and emotional change. Our proposed EEG-based emotion recognition (EER) model could identify 20 types of emotions from 32 EEG channels, with average recognition accuracies above 90% and 80%, respectively. Our proposed music-based emotion classification (MEC) model could classify music into eight typical emotion types based on nine music feature combinations, with an average classification accuracy above 90%. In addition, using the two models, we analyzed the semantic mapping between music types and emotional changes from multiple perspectives. The results showed that joy-type music videos could shift fear, disgust, mania, and trust toward surprise or intimacy, while sad-type music videos could shift intimacy toward fear.
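To make the matching pipeline concrete, here is a minimal Python sketch of the idea described above: an EEG-side classifier labels the listener's current emotion, a music-side classifier labels candidate tracks, and a mapping table selects a music type intended to shift that emotion. All function names, labels, and the mapping table are illustrative assumptions, not the authors' implementation; only the channel count (32), the number of music feature combinations (nine), and the reported joy-type mapping come from the abstract.

```python
# Sketch of emotion semantic matching between EEG and music signals.
# Placeholder classifiers stand in for the trained EER and MEC models.

import numpy as np

EEG_CHANNELS = 32      # the paper uses 32-channel EEG signals
N_MUSIC_FEATURES = 9   # the paper uses nine music feature combinations


def classify_eeg_emotion(eeg_window: np.ndarray) -> str:
    """Placeholder for the EER model: maps a (channels x samples)
    EEG window to an emotion label. A trained model would go here."""
    assert eeg_window.shape[0] == EEG_CHANNELS
    return "fear"  # fixed label for the sketch


def classify_music_emotion(features: np.ndarray) -> str:
    """Placeholder for the MEC model: maps a music feature vector
    to one of eight typical music emotion types."""
    assert features.shape[0] == N_MUSIC_FEATURES
    return "joy"  # fixed label for the sketch


# Illustrative therapy mapping, loosely following the reported finding
# that joy-type music shifted fear, disgust, mania, and trust toward
# surprise or intimacy.
TARGET_MUSIC_TYPE = {
    "fear": "joy",
    "disgust": "joy",
    "mania": "joy",
    "trust": "joy",
}


def select_track(eeg_window, candidate_tracks):
    """Pick a candidate track whose predicted music emotion type
    matches the type targeted for the listener's current emotion."""
    emotion = classify_eeg_emotion(eeg_window)
    wanted = TARGET_MUSIC_TYPE.get(emotion)
    for name, features in candidate_tracks.items():
        if classify_music_emotion(features) == wanted:
            return name, emotion
    return None, emotion


if __name__ == "__main__":
    eeg = np.random.randn(EEG_CHANNELS, 512)
    tracks = {"track_a": np.random.randn(N_MUSIC_FEATURES)}
    print(select_track(eeg, tracks))  # -> ('track_a', 'fear')
```

The table-driven selection step is just one plausible way to operationalize the matching; the paper's actual mapping between recognized emotions and music types may be learned rather than hand-specified.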