Sens. Mater., Vol. 32, No. 4, 2020, pp. 1261-1277
S&M2175 Research Paper of Special Issue
https://doi.org/10.18494/SAM.2020.2552
Published: April 10, 2020

Unsupervised Recurrent Neural Network with Parametric Bias Framework for Human Emotion Recognition with Multimodal Sensor Data Fusion

Jie Li, Junpei Zhong, and Min Wang
(Received August 9, 2019; Accepted October 16, 2019)

Keywords: emotion recognition, multimodal sensors, recurrent neural network, subconscious behaviors
In this paper, we present an emotion recognition framework based on a recurrent neural network with parametric bias (RNNPB) to classify six basic human emotions (joy, pride, fear, anger, sadness, and neutral). To capture the expressions from which emotions are recognized, human joint coordinates, angles, and angular velocities are fused during signal preprocessing. A wearable Myo armband and a Kinect sensor collect the human joint angular velocities and angles, respectively. In this way, multiple modalities of subconscious behavior are combined to improve the classification performance of the RNNPB. Two comparative experiments were performed to demonstrate that the fused data outperform single-modality sensor data from one person. To investigate the robustness of the proposed framework, we carried out a further experiment with fused data from several people. The recognition results show that the six emotions can be classified using the RNNPB framework, verifying the effectiveness of the proposed framework.
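The fusion step described above combines per-timestep features from the two sensors into a single input sequence for the network. The paper does not give the exact preprocessing code, so the following is a minimal sketch under assumed names and shapes: `kinect_angles` (joint angles from the Kinect) and `myo_angular_vel` (angular velocities from the Myo armband) are resampled to a common sequence length and concatenated along the feature axis.

```python
import numpy as np

def fuse_modalities(kinect_angles, myo_angular_vel, target_len=50):
    """Hypothetical fusion sketch: resample each modality to a common
    sequence length, then concatenate per-timestep features so one
    fused sequence can be fed to the RNNPB."""
    def resample(seq, n):
        # Linearly interpolate each feature channel to n time steps,
        # since the two sensors sample at different rates.
        seq = np.asarray(seq, dtype=float)
        old = np.linspace(0.0, 1.0, len(seq))
        new = np.linspace(0.0, 1.0, n)
        return np.stack(
            [np.interp(new, old, seq[:, j]) for j in range(seq.shape[1])],
            axis=1,
        )

    a = resample(kinect_angles, target_len)   # (target_len, d_angles)
    v = resample(myo_angular_vel, target_len) # (target_len, d_vel)
    # Fused sequence: (target_len, d_angles + d_vel)
    return np.concatenate([a, v], axis=1)
```

The names, sequence length, and interpolation scheme here are illustrative assumptions; the key point is that fusion happens at the feature level, before the sequence enters the network.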
Corresponding author: Min Wang

This work is licensed under a Creative Commons Attribution 4.0 International License.

Cite this article: Jie Li, Junpei Zhong, and Min Wang, Unsupervised Recurrent Neural Network with Parametric Bias Framework for Human Emotion Recognition with Multimodal Sensor Data Fusion, Sens. Mater., Vol. 32, No. 4, 2020, pp. 1261-1277.