EEG-based classification of human states remains challenging in human-computer interaction (HCI). Because it reflects brain activity directly, electroencephalography (EEG) offers significant advantages for emotion classification research. In this study, we recorded EEG signals from 12 participants while they watched four-minute emotional audio-visual movie clips. Six states were studied: anger, excitement, fear, happiness, sadness, and a neutral state. We preprocessed the raw data to obtain clean signals and extracted the power spectrum with the Fast Fourier Transform (FFT) to generate feature vectors. We then conducted extensive experiments to validate the classification of human states using subject-independent machine-learning techniques. The LSTM network achieved the highest classification accuracy of 81.46% for the six emotional states, while the SVM classifier reached only 68.64%. When varying the number of layers in the deep learning models, a two-layer Bi-LSTM network achieved 82.89% accuracy. In conclusion, extensive experiments on our collected dataset indicate that time-sequence models such as the LSTM family yield superior classification results (up to 82.89%) compared with the other methods. The results also show that long-duration EEG signals are crucial for detecting emotional state changes across subject types, both within individual subjects and across subjects. In future research, we will investigate additional evaluation experiments with deep learning models and propose novel EEG-based features to further improve emotion classification accuracy.
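The FFT-based power-spectrum feature extraction mentioned above can be sketched as follows. This is a minimal illustration only: the paper does not specify the sampling rate, epoch length, or band definitions, so the 128 Hz rate, 2-second epochs, and standard clinical EEG bands used here are assumptions, not the authors' exact pipeline.

```python
import numpy as np

# Assumed parameters (not specified in the abstract).
FS = 128          # sampling rate in Hz, illustrative assumption
BANDS = {         # conventional EEG frequency bands (Hz)
    "delta": (1, 4),
    "theta": (4, 8),
    "alpha": (8, 13),
    "beta":  (13, 30),
    "gamma": (30, 45),
}

def band_power_features(epoch, fs=FS):
    """Mean FFT power per frequency band for one single-channel EEG epoch."""
    spectrum = np.abs(np.fft.rfft(epoch)) ** 2       # power spectrum via FFT
    freqs = np.fft.rfftfreq(len(epoch), d=1.0 / fs)  # matching frequency axis
    return np.array([
        spectrum[(freqs >= lo) & (freqs < hi)].mean()
        for lo, hi in BANDS.values()
    ])

# Usage: a synthetic 2 s epoch dominated by a 10 Hz (alpha-band) rhythm.
rng = np.random.default_rng(0)
t = np.arange(0, 2, 1.0 / FS)
epoch = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(len(t))
feats = band_power_features(epoch)
print(feats.shape)  # one feature per band -> (5,)
```

Per-channel band powers like these would then be concatenated across electrodes to form the feature vector fed to the LSTM or SVM classifier.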
Lee, W., & Son, G. (2024). Investigation of human state classification via EEG signals elicited by emotional audio-visual stimulation. Multimedia Tools and Applications, 83(29), 73217–73231. https://doi.org/10.1007/s11042-023-16294-w