In this paper, we present an emotion recognition framework based on a recurrent neural network with parametric bias (RNNPB) to classify six basic human emotional states (joy, pride, fear, anger, sadness, and neutral). To capture the expressions that convey these emotions, human joint coordinates, angles, and angular velocities are fused during signal preprocessing. A wearable Myo armband and a Kinect sensor collect the joint angular velocities and joint angles, respectively. The resulting multimodal representation of subconscious behaviors is intended to improve the classification performance of the RNNPB. Two comparative experiments demonstrate that classification with the fused data outperforms classification with single-modality sensor data from one person. To investigate the robustness of the proposed framework, a further experiment was carried out with fused data from several people. The recognition results show that the six emotional states can be classified with the RNNPB framework, verifying the effectiveness of the proposed approach.
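The paper itself does not include code; the following is a minimal sketch of the RNNPB recognition scheme the abstract describes, written in PyTorch. All names (`RNNPB`, `pb_for`), the dimensions, and the nearest-PB classification step are illustrative assumptions, not the authors' implementation. The key idea is that a small parametric-bias (PB) vector is appended to every fused sensor frame; at recognition time the trained weights are frozen and only the PB vector is fitted to the new sequence.

```python
# Hypothetical sketch of an RNNPB, assuming PyTorch. All names and
# dimensions are illustrative; this is not the authors' code.
import torch
import torch.nn as nn

class RNNPB(nn.Module):
    """RNN that predicts the next fused sensor frame; a small
    parametric-bias (PB) vector is appended to every input frame."""
    def __init__(self, in_dim, hid_dim, pb_dim):
        super().__init__()
        self.rnn = nn.RNN(in_dim + pb_dim, hid_dim)
        self.out = nn.Linear(hid_dim, in_dim)

    def forward(self, seq, pb):
        # seq: (T, in_dim) fused frames (e.g., Kinect joint angles
        # concatenated with Myo angular velocities, z-scored);
        # pb: (pb_dim,) bias vector shared by all frames of one sequence
        T = seq.size(0)
        x = torch.cat([seq, pb.expand(T, -1)], dim=1).unsqueeze(1)
        h, _ = self.rnn(x)             # (T, 1, hid_dim)
        return self.out(h.squeeze(1))  # prediction of the next frame

def pb_for(model, seq, steps=200, lr=0.05):
    """Recognition: keep the trained weights fixed and fit only the
    PB vector to a new sequence by minimizing prediction error."""
    pb_dim = model.rnn.input_size - seq.size(1)
    pb = torch.zeros(pb_dim, requires_grad=True)
    opt = torch.optim.Adam([pb], lr=lr)  # updates pb only, not weights
    for _ in range(steps):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(seq, pb)[:-1], seq[1:])
        loss.backward()
        opt.step()
    return pb.detach()  # classify, e.g., by nearest training-set PB
```

In this sketch, training would jointly optimize the network weights and one PB vector per training sequence on next-frame prediction, so that emotions correspond to regions of the self-organized PB space; an unseen sequence is then recognized by the PB vector recovered with `pb_for`.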
Citation: Li, J., Zhong, J., & Wang, M. (2020). Unsupervised recurrent neural network with parametric bias framework for human emotion recognition with multimodal sensor data fusion. Sensors and Materials, 32(4), 1261–1277. https://doi.org/10.18494/SAM.2020.2552