S&M3821 Research Paper of Special Issue https://doi.org/10.18494/SAM5064 Published: November 12, 2024

Self-cure Dual-branch Network for Facial Expression Recognition Based on Visual Sensors

Dongsheng Wu, Yifan Chen, Yuting Lin, Pengfei Xu, and Dongxu Gao
(Received April 4, 2024; Accepted May 31, 2024)

Keywords: visual sensors, self-cure network, two-branch method, facial expression recognition
With the rapid development of sensors and sensor technology, facial expression recognition (FER) systems can now be developed and applied in real-world scenarios. Visual scan sensors and ambient light sensors capture clear, noise-free facial images. In the real world, however, annotating large-scale facial expression datasets is challenging owing to inconsistent labels, which arise from the annotators’ subjectivity and the ambiguity of facial expressions. Moreover, current studies show limitations when addressing differences in facial expression caused by the gender gap. We rely not only on visual sensors for FER but also on nonvisual sensors. In this paper, we therefore propose a self-cure dual-branch network (SC-DBN) for FER, which automatically prevents deep networks from overfitting ambiguous samples. First, on the basis of SC-DBN, a two-branch training method is designed that takes full advantage of gender information. Furthermore, a self-attention mechanism highlights the essential samples, weighting each one under a rank regularization. Finally, a relabeling module modifies the labels of samples whose annotations are inconsistent. Extensive experiments on public datasets show that SC-DBN can effectively integrate gender information with a self-cure network to improve performance.
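The self-cure components summarized above (importance weighting of samples, a rank regularization that separates high- and low-importance groups, and relabeling of low-weight samples) can be illustrated with a minimal sketch. The function names, the group-split ratio, the margin, and the relabeling threshold below are all hypothetical illustrations of the general idea, not the paper's actual implementation:

```python
def rank_regularization(weights, high_ratio=0.7, margin=0.15):
    """Rank-regularization sketch: sort sample importance weights,
    split them into a high-importance and a low-importance group,
    and penalize the model when the mean of the high group does not
    exceed the mean of the low group by at least `margin`."""
    w = sorted(weights, reverse=True)
    k = max(1, int(len(w) * high_ratio))
    high, low = w[:k], w[k:]
    mean_high = sum(high) / len(high)
    mean_low = sum(low) / len(low) if low else 0.0
    return max(0.0, margin - (mean_high - mean_low))

def relabel(weight, predicted_label, given_label, threshold=0.3):
    """Relabeling sketch: a sample whose importance weight falls below
    `threshold` is treated as likely mislabeled, and its annotation is
    replaced by the network's own prediction."""
    return predicted_label if weight < threshold else given_label
```

Under this sketch, a batch whose weights already separate cleanly incurs no rank-regularization loss, while a batch of uniform weights is penalized by the full margin; low-weight (ambiguous) samples have their labels overwritten by the model's prediction.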
Corresponding authors: Yifan Chen and Dongxu Gao

This work is licensed under a Creative Commons Attribution 4.0 International License.

Cite this article: Dongsheng Wu, Yifan Chen, Yuting Lin, Pengfei Xu, and Dongxu Gao, Self-cure Dual-branch Network for Facial Expression Recognition Based on Visual Sensors, Sens. Mater., Vol. 36, No. 11, 2024, pp. 4631–4649.