Emotion Recognition from EEG Using Rhythm Synchronization Patterns with Joint Time-Frequency-Space Correlation

Abstract

EEG-based emotion recognition (ER), one application of the brain-computer interface (BCI), has recently attracted wide attention. However, due to the ambiguity of human emotions and the complexity of EEG signals, an EEG-ER system that recognizes emotions with high accuracy is not easy to achieve. In this paper, by combining discrete wavelet transform, correlation analysis, and neural network methods, we propose an emotion recognition model based on rhythm synchronization patterns to distinguish the responses to different emotional audio and video stimuli. In this model, EEG signals from the entire scalp are analyzed through a joint time-frequency-space correlation, which facilitates deeper learning and representation of affective patterns and thereby improves recognition accuracy. The accuracy of the proposed multi-layer EEG-ER system is compared with that of various feature extraction methods. In the analysis, average and maximum classification rates of 64.0% and 67.0% were obtained for arousal, and 66.6% and 76.0% for valence.
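The abstract describes a pipeline of discrete wavelet decomposition into EEG rhythms, channel-pair correlation (synchronization) features, and a neural network classifier. The sketch below illustrates one plausible reading of that pipeline in Python; the sampling rate, wavelet ('db4'), decomposition depth, channel count, and MLP layout are illustrative assumptions, not the configuration reported in the paper.

```python
# Minimal sketch of a rhythm-synchronization feature pipeline, assuming a
# (channels x samples) EEG trial sampled at 128 Hz. Not the authors' exact method.
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

FS = 128  # assumed sampling rate (Hz)

def rhythm_bands(signal, wavelet="db4", level=4):
    """Split one channel into approximate EEG rhythms via discrete wavelet transform.

    With fs = 128 Hz and 4 levels, the detail coefficients roughly cover:
    D1 ~ 32-64 Hz (gamma), D2 ~ 16-32 Hz (beta), D3 ~ 8-16 Hz (alpha),
    D4 ~ 4-8 Hz (theta), A4 ~ 0-4 Hz (delta).
    """
    a4, d4, d3, d2, d1 = pywt.wavedec(signal, wavelet, level=level)
    return {"delta": a4, "theta": d4, "alpha": d3, "beta": d2, "gamma": d1}

def synchronization_features(trial):
    """Correlation-based synchronization pattern for one trial.

    trial: array of shape (n_channels, n_samples).
    For every rhythm band, compute the channel-by-channel Pearson correlation
    matrix and keep its upper triangle as a flat feature vector, so the features
    jointly reflect time (wavelet coefficients), frequency (rhythm band), and
    space (channel pairs).
    """
    bands = [rhythm_bands(ch) for ch in trial]          # per-channel decomposition
    feats = []
    for name in ("delta", "theta", "alpha", "beta", "gamma"):
        band_mat = np.vstack([b[name] for b in bands])  # (n_channels, n_coeffs)
        corr = np.corrcoef(band_mat)                    # channel-pair synchronization
        iu = np.triu_indices_from(corr, k=1)
        feats.append(corr[iu])
    return np.concatenate(feats)

if __name__ == "__main__":
    # Toy data: 40 trials, 32 channels, 8 s at 128 Hz, random binary valence labels.
    rng = np.random.default_rng(0)
    X = np.array([synchronization_features(rng.standard_normal((32, 8 * FS)))
                  for _ in range(40)])
    y = rng.integers(0, 2, size=40)
    clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
    clf.fit(X, y)
    print("training accuracy:", clf.score(X, y))
```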

Citation (APA)

Kuai, H., Xu, H., & Yan, J. (2017). Emotion Recognition from EEG Using Rhythm Synchronization Patterns with Joint Time-Frequency-Space Correlation. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10654 LNAI, pp. 159–168). Springer Verlag. https://doi.org/10.1007/978-3-319-70772-3_15
