Sens. Mater., Vol. 37, No. 5, 2025, pp. 2049-2060
S&M4041 Research Paper of Special Issue
https://doi.org/10.18494/SAM5472
Published: May 30, 2025

Heterogeneous Sensor Fusion for Obstacle Localization in Mobile Robot Navigation
Chung-Hsun Sun, Ying-Shu Chuang, Yao-Yu Tsai, and Hsiang-Chieh Chen
(Received November 11, 2024; Accepted April 22, 2025)

Keywords: heterogeneous sensor fusion, mobile robot, monocular depth estimation, semantic segmentation
In this study, we focus on the application of heterogeneous sensor fusion technology, integrating a scanning laser rangefinder (LRF) and a low-cost camera, to detect obstacles in front of a robot. This technology enables real-time obstacle identification and path replanning, allowing the robot to navigate unknown and cluttered environments efficiently. For indoor navigation, the initial map required for path planning is first built using a simultaneous localization and mapping method. The two-dimensional locations of existing or newly emerged obstacles are then determined by comparing the LRF measurements against this initial map. Furthermore, analyzing the captured images with semantic segmentation and monocular depth estimation provides three-dimensional information about the robot's surroundings. In our work, we leverage the accuracy of the LRF and the high resolution of the visual sensor to effectively integrate the heterogeneous sensors, enabling the robot to sense and avoid obstacles. Experimental results demonstrate the effectiveness of the proposed sensor fusion technology for indoor navigation of a mobile robot in unexplored environments.
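The map-comparison step described in the abstract could be sketched as follows. This is a minimal illustration under assumptions introduced here, not the authors' implementation: the occupancy-grid layout, pose convention, the function name `locate_new_obstacles`, and the `free_threshold` parameter are all hypothetical. The idea is that LRF scan endpoints falling in cells the initial SLAM map marks as free indicate newly emerged obstacles and can trigger path replanning.

```python
# Minimal sketch (illustrative only): flag LRF hits that land in cells the
# initial map considers free, i.e., candidate newly appeared obstacles.
import numpy as np

def locate_new_obstacles(scan_ranges, scan_angles, robot_pose,
                         occupancy_grid, map_origin, resolution,
                         free_threshold=0.2):
    """Return world (x, y) points where the LRF hits something the map calls free.

    scan_ranges, scan_angles : 1-D arrays of LRF measurements (m, rad)
    robot_pose               : (x, y, theta) of the robot in map coordinates
    occupancy_grid           : 2-D array of occupancy probabilities in [0, 1]
    map_origin               : (x0, y0) world coordinates of grid cell (0, 0)
    resolution               : grid cell size in meters
    """
    x, y, theta = robot_pose
    # Project each range reading to a 2-D point in the map frame.
    hit_x = x + scan_ranges * np.cos(theta + scan_angles)
    hit_y = y + scan_ranges * np.sin(theta + scan_angles)

    # Convert world coordinates to grid indices.
    col = ((hit_x - map_origin[0]) / resolution).astype(int)
    row = ((hit_y - map_origin[1]) / resolution).astype(int)

    new_obstacles = []
    for r, c, px, py in zip(row, col, hit_x, hit_y):
        if 0 <= r < occupancy_grid.shape[0] and 0 <= c < occupancy_grid.shape[1]:
            # A hit inside a cell the initial map marks as free suggests a
            # newly emerged obstacle; its 2-D location is reported for replanning.
            if occupancy_grid[r, c] < free_threshold:
                new_obstacles.append((px, py))
    return np.array(new_obstacles)
```

In such a scheme, the camera-based semantic segmentation and monocular depth estimation would supply complementary 3D cues around these 2-D detections; how the two modalities are actually fused is detailed in the full paper.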
Corresponding author: Hsiang-Chieh Chen

This work is licensed under a Creative Commons Attribution 4.0 International License.

Cite this article: Chung-Hsun Sun, Ying-Shu Chuang, Yao-Yu Tsai, and Hsiang-Chieh Chen, Heterogeneous Sensor Fusion for Obstacle Localization in Mobile Robot Navigation, Sens. Mater., Vol. 37, No. 5, 2025, pp. 2049-2060.