Electroencephalography (EEG) signals recorded at each channel mainly reflect the activity of the brain region close to that channel's position, and the coordinated activity of multiple brain regions constitutes the response to emotion-inducing stimuli. In this paper, temporal, spatial, and connectivity features are extracted from EEG signals recorded over the whole scalp and used for emotion recognition via a proposed model, the spatial-temporal-connective multi-scale convolutional neural network (STC-CNN). Channel-to-channel connectivity is computed to describe region-to-region cooperation in the brain under emotional stimuli. The model achieved average accuracies of 96.79% and 96.89% in classifying the two emotional dimensions of valence and arousal.
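The abstract does not state how the channel-to-channel connectivity feature is computed. As an illustration only, the following sketch builds a simple connectivity matrix from one multi-channel EEG segment using Pearson correlation, a common choice for this kind of feature; the function and variable names are hypothetical and not taken from the paper.

```python
import numpy as np

def connectivity_matrix(eeg_segment: np.ndarray) -> np.ndarray:
    """Channel-to-channel connectivity for one EEG segment.

    eeg_segment: array of shape (n_channels, n_samples), one trial or window.
    Returns an (n_channels, n_channels) matrix whose entry (i, j) is the
    Pearson correlation between channels i and j -- an assumed stand-in for
    the connectivity feature described in the abstract.
    """
    # np.corrcoef treats each row as one variable, i.e. one EEG channel.
    return np.corrcoef(eeg_segment)

# Example: 32 channels sampled at 128 Hz, a 3-second window of synthetic data.
segment = np.random.randn(32, 3 * 128)
conn = connectivity_matrix(segment)  # shape (32, 32), values in [-1, 1]
```

In a pipeline like the one the abstract describes, such a matrix would be fed to the network alongside the temporal and spatial features; the actual connectivity measure and input layout used by STC-CNN are specified in the full paper.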
Citation: Li, T., Fu, B., Wu, Z., & Liu, Y. (2023). EEG-Based Emotion Recognition Using Spatial-Temporal-Connective Features via Multi-Scale CNN. IEEE Access, 11, 41859–41867. https://doi.org/10.1109/ACCESS.2023.3270317