Coupled Projection Transfer Metric Learning for Cross-Session Emotion Recognition from EEG


Abstract

Distribution discrepancies between different sessions greatly degrade the performance of video-evoked electroencephalogram (EEG) emotion recognition. Because the EEG signal is weak and non-stationary, such discrepancies arise across the trials within each session, and even among trials belonging to the same emotion. To this end, we propose a Coupled Projection Transfer Metric Learning (CPTML) model that jointly performs domain alignment and graph-based metric learning, providing a unified framework for simultaneously minimizing cross-session and cross-trial divergences. Experiments on the SEED_IV emotional dataset show that (1) CPTML performs significantly better than several competing approaches; (2) in the CPTML-induced subspace, cross-session distribution discrepancies are minimized and the emotion metric graphs across different trials are optimized, demonstrating the effectiveness of data alignment and metric exploration; and (3) EEG frequency bands and channels critical for emotion recognition are automatically identified from the learned projection matrices, offering further insight into how emotional effects manifest in EEG.
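The domain-alignment objective described in the abstract can be illustrated with a minimal sketch. The snippet below is an assumption-laden toy, not the CPTML algorithm itself: it measures the cross-session discrepancy with a squared linear-kernel Maximum Mean Discrepancy (MMD) and uses simple per-session mean-centering as a stand-in for the coupled projections that CPTML actually learns. The function name `linear_mmd` and the synthetic "session" data are illustrative only.

```python
import numpy as np

def linear_mmd(Xs, Xt):
    # Squared linear-kernel MMD: the squared Euclidean distance
    # between the two domains' feature means.
    diff = Xs.mean(axis=0) - Xt.mean(axis=0)
    return float(diff @ diff)

rng = np.random.default_rng(0)
Xs = rng.normal(0.0, 1.0, size=(200, 8))  # "source session" features
Xt = rng.normal(1.0, 1.0, size=(200, 8))  # "target session", mean-shifted

mmd_before = linear_mmd(Xs, Xt)

# Crude alignment stand-in: remove each session's own mean before
# comparing. CPTML instead learns coupled projection matrices, but the
# goal -- shrinking the cross-session distribution discrepancy -- is
# the same.
Xs_aligned = Xs - Xs.mean(axis=0)
Xt_aligned = Xt - Xt.mean(axis=0)

mmd_after = linear_mmd(Xs_aligned, Xt_aligned)
print(mmd_before, mmd_after)  # the discrepancy collapses after centering
```

In a real transfer-learning pipeline the alignment transform would be fit jointly with the recognition objective, as the abstract describes, rather than applied as an independent preprocessing step.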

Citation (APA)

Shen, F., Peng, Y., Dai, G., Lu, B., & Kong, W. (2022). Coupled Projection Transfer Metric Learning for Cross-Session Emotion Recognition from EEG. Systems, 10(2). https://doi.org/10.3390/systems10020047
