pp. 2329–2341
S&M 2262, Research Paper of Special Issue
https://doi.org/10.18494/SAM.2020.2881
Published: July 10, 2020

Classification of Restlessness Level by Deep Learning of Visual Geometry Group Convolution Neural Network with Acoustic Speech and Visual Face Sensor Data for Smart Care Applications

Ing-Jr Ding and Nai-Wei Zheng
(Received June 27, 2019; Accepted June 1, 2020)

Keywords: restlessness classification, VGG-16 CNN, VGG-19 CNN, acoustic speech, visual face
Recently, acoustic speech recognition and visual face identification have become mature techniques that are widely used in real-life applications. However, human cognitive recognition tasks such as emotion classification remain a major challenge. In this study, restlessness level recognition using a deep learning scheme of the Visual Geometry Group (VGG) convolutional neural network (CNN), with acoustic speech and visual face sensor data as input, is presented for home care applications. The well-known Microsoft Kinect device, which provides a red–green–blue (RGB) camera and a microphone array, is employed to acquire facial expression and vocal variation data, respectively. Both VGG-16 and VGG-19 CNN deep learning models are used to evaluate the effectiveness of restlessness level classification with three different input data modalities: acoustic speech observations alone, visual face observations alone, and combined speech and face observations. Experimental results on categorizing nine defined restlessness levels demonstrate the effectiveness of the presented approach. People with restlessness problems can benefit from the immediate, intelligent care provided by the system proposed in this study.
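For readers unfamiliar with how a stock VGG network is adapted to a fixed set of output classes, the following minimal PyTorch sketch shows the general idea. It is not the authors' code: only the class count of nine is taken from the abstract, and the framework, preprocessing, and all other details are assumptions.

```python
# Hypothetical sketch: adapting a stock VGG-16 to the nine
# restlessness levels described in the abstract. Every detail
# beyond the class count is an assumption.
import torch
import torch.nn as nn
from torchvision import models

NUM_LEVELS = 9  # nine defined restlessness levels (from the abstract)

# Stock VGG-16 backbone; models.vgg19() would be swapped in to
# mirror the VGG-19 variant compared in the paper.
model = models.vgg16()

# Replace the final 1000-way ImageNet classifier layer with a
# nine-way output for the restlessness levels.
model.classifier[6] = nn.Linear(model.classifier[6].in_features, NUM_LEVELS)

# A 224x224 RGB face crop from the Kinect camera is a natural input.
# The speech stream would first need an image-like representation
# (e.g., a spectrogram), which is one common choice, not necessarily
# the authors'.
model.eval()
with torch.no_grad():
    face_batch = torch.randn(1, 3, 224, 224)  # dummy input batch
    logits = model(face_batch)                # shape: (1, 9)
    level = logits.argmax(dim=1)              # predicted restlessness level
```

The abstract does not say how the speech and face streams are fused in the combined-modality case; feature-level fusion (concatenating the penultimate-layer features of two VGG streams) and decision-level fusion (averaging the two nine-way score vectors) are both common designs.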
Corresponding author: Ing-Jr Ding

This work is licensed under a Creative Commons Attribution 4.0 International License.

Cite this article: Ing-Jr Ding and Nai-Wei Zheng, Classification of Restlessness Level by Deep Learning of Visual Geometry Group Convolution Neural Network with Acoustic Speech and Visual Face Sensor Data for Smart Care Applications, Sens. Mater., Vol. 32, No. 7, 2020, pp. 2329–2341.