S&M3471 Research Paper of Special Issue
https://doi.org/10.18494/SAM4592
Published: December 15, 2023

Real-time Hand Movement Trajectory Tracking with Deep Learning

Po-Tong Wang, Jia-Shing Sheu, and Chih-Fang Shen

(Received July 15, 2023; Accepted November 9, 2023)

Keywords: real-time hand tracking, deep learning, single-shot multibox detector (SSD), CAMShift, object detection, human–computer interaction (HCI)
In this study, we employed deep learning to develop a real-time hand trajectory tracking system. Our primary approach integrates the MobileNetV2 single-shot multibox detector, valued for its detection accuracy, with the versatile CAMShift algorithm. This combination yields robust hand detection across diverse scenarios. Through rigorous testing on webcam images and the use of feature extraction methods such as contour discernment and skin hue differentiation, we report an 88.17% increase in detection accuracy over traditional models. Moreover, with a latency of only 0.0343 s, the system is well suited to immersive gaming and assistive devices for individuals with disabilities.
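To make the detector-plus-tracker idea concrete, the sketch below shows how a detection window could seed CAMShift tracking with OpenCV. This is a minimal illustration, not the authors' implementation: the detect_hand stand-in (a skin-hue contour search) substitutes for the MobileNetV2 SSD, and the HSV skin thresholds are assumed values.

```python
import cv2
import numpy as np

# Assumed HSV skin-hue range; the paper's thresholds may differ.
LOWER_SKIN = np.array((0, 30, 60), dtype=np.uint8)
UPPER_SKIN = np.array((20, 150, 255), dtype=np.uint8)

def detect_hand(frame):
    """Stand-in for the MobileNetV2 SSD detector: returns the bounding box
    (x, y, w, h) of the largest skin-coloured contour, or None."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_SKIN, UPPER_SKIN)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    return cv2.boundingRect(max(contours, key=cv2.contourArea))

def track_hand(video_source=0):
    cap = cv2.VideoCapture(video_source)
    track_window, roi_hist = None, None
    # Stop CAMShift after 10 iterations or when the window moves < 1 px.
    term_crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

        if track_window is None:
            box = detect_hand(frame)          # detector supplies the initial window
            if box is not None:
                x, y, w, h = box
                roi = hsv[y:y + h, x:x + w]
                mask = cv2.inRange(roi, LOWER_SKIN, UPPER_SKIN)
                roi_hist = cv2.calcHist([roi], [0], mask, [180], [0, 180])
                cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)
                track_window = (x, y, w, h)
        else:
            # Back-project the hue histogram and let CAMShift follow the hand.
            back_proj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)
            rot_rect, track_window = cv2.CamShift(back_proj, track_window, term_crit)
            pts = cv2.boxPoints(rot_rect).astype(np.int32)
            cv2.polylines(frame, [pts], True, (0, 255, 0), 2)

        cv2.imshow("hand tracking", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    track_hand()
```

In this arrangement the detector only runs until a window is found, after which the lightweight colour-based tracker carries the trajectory frame to frame, which is one way a low per-frame latency can be achieved.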
Corresponding author: Po-Tong Wang

This work is licensed under a Creative Commons Attribution 4.0 International License.

Cite this article: Po-Tong Wang, Jia-Shing Sheu, and Chih-Fang Shen, Real-time Hand Movement Trajectory Tracking with Deep Learning, Sens. Mater., Vol. 35, No. 12, 2023, pp. 4117–4129.