S&M4067 Research Paper of Special Issue
https://doi.org/10.18494/SAM5557
Published: June 25, 2025

Debris Pattern Recognition Based on Visual Sensor and Image Stitching Technology

Wei-Yuan Zheng and Jih-Gau Juang
(Received January 15, 2025; Accepted June 5, 2025)

Keywords: image stitching, object detection, deep learning
In this study, we applied multiple unmanned aerial vehicles (UAVs) and visual sensors with a deep learning neural network, You Only Look Once (YOLO), to quickly and effectively recognize significant areas of debris. Information on debris locations, area sizes, and images is sent to the monitoring system; the debris distribution is then analyzed so that the source of the debris can be found. The pattern recognition process uses a variety of feature detection, description, and point matching methods for real-time image stitching of the scene. The UAVs can obtain large-area scene images and check whether undetected debris exists. A comparison of different YOLO models is given, and the debris recognition results obtained with various types of data and stitching methods are used to analyze the real-time image stitching performance of the different approaches.
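As a rough illustration of the feature detection, description, and point matching pipeline that underlies this kind of image stitching (not the authors' implementation; function names, parameters, and file names below are placeholders), the following sketch stitches two overlapping UAV frames with ORB features, a ratio-test matcher, and a RANSAC homography using OpenCV.

```python
# Minimal sketch, assuming OpenCV: detect ORB features in two overlapping UAV
# frames, match them, estimate a homography with RANSAC, and warp one frame
# onto the other. Image file names are hypothetical.
import cv2
import numpy as np

def stitch_pair(img_left, img_right, max_features=2000, good_ratio=0.75):
    """Warp img_right onto img_left's image plane and overlay the two frames."""
    orb = cv2.ORB_create(nfeatures=max_features)        # feature detection + description
    kp1, des1 = orb.detectAndCompute(img_left, None)
    kp2, des2 = orb.detectAndCompute(img_right, None)

    # Brute-force Hamming matcher with Lowe's ratio test for point matching
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    raw_matches = matcher.knnMatch(des2, des1, k=2)
    good = [m for m, n in (p for p in raw_matches if len(p) == 2)
            if m.distance < good_ratio * n.distance]
    if len(good) < 4:
        raise RuntimeError("not enough matches to estimate a homography")

    src = np.float32([kp2[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp1[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)  # robust estimation

    h, w = img_left.shape[:2]
    canvas = cv2.warpPerspective(img_right, H, (w * 2, h))  # warp right frame
    canvas[0:h, 0:w] = img_left                              # overlay left frame
    return canvas

if __name__ == "__main__":
    left = cv2.imread("uav_frame_left.jpg")    # placeholder file names
    right = cv2.imread("uav_frame_right.jpg")
    cv2.imwrite("stitched.jpg", stitch_pair(left, right))
```

In practice, a binary detector such as ORB is only one option; the choice of detector, descriptor, and matcher is exactly the kind of variation whose effect on real-time stitching the paper compares.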
Corresponding author: Jih-Gau Juang

This work is licensed under a Creative Commons Attribution 4.0 International License.

Cite this article: Wei-Yuan Zheng and Jih-Gau Juang, Debris Pattern Recognition Based on Visual Sensor and Image Stitching Technology, Sens. Mater., Vol. 37, No. 6, 2025, pp. 2463-2487.