Sensors and Materials, Vol. 32, No. 7 (2020), pp. 2261-2275
S&M2256 Research Paper of Special Issue
https://doi.org/10.18494/SAM.2020.2870
Published: July 10, 2020

Visual Odometry Implementation and Accuracy Evaluation Based on Real-time Appearance-based Mapping

Bo Hu and He Huang
(Received May 15, 2019; Accepted May 20, 2020)

Keywords: visual odometry, RTAB-MAP, feature detection algorithms, RANSAC, accuracy evaluation
With the rapid development of artificial intelligence and machine learning technology, various robots serving humans have emerged, and robot positioning and navigation technology has become a research hotspot. Robots relying on visual odometry (VO) are favored for their low cost and wide range of applications. In this paper, we focus on the algorithm and implementation of VO based on the feature point method. First, the current mainstream feature detection algorithms are compared and analyzed in terms of real-time performance and time efficiency, and the random sample consensus (RANSAC) algorithm is used to eliminate mismatches in the image feature matching process. Second, using the Kinect vision sensor and the Turtlebot mobile robot system, we build an RGB-D simultaneous localization and mapping (SLAM) experimental platform and realize VO based on real-time appearance-based mapping (RTAB-MAP). Finally, to verify the actual running performance of the VO, a series of motion experiments is performed in an indoor environment. The accuracy of the pose estimation of the VO is evaluated, which provides a useful reference for the development of mobile robot positioning technology.
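The mismatch-elimination step described above can be illustrated with a minimal RANSAC sketch. This is a hypothetical, simplified illustration (not the authors' implementation): it fits a pure 2D translation to putative point correspondences, whereas real VO pipelines estimate a full rigid-body or homography/essential-matrix model. The function name and parameters are assumptions for illustration only.

```python
import random

def ransac_translation(matches, iters=200, thresh=2.0, seed=0):
    """Estimate a 2D translation from putative matches [((px, py), (qx, qy)), ...],
    rejecting mismatched pairs. Returns ((dx, dy), inlier_list).

    A translation needs only one correspondence per hypothesis, so the
    minimal sample size is 1; richer motion models need larger samples."""
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(iters):
        # Draw a minimal random sample and hypothesise a translation from it.
        (px, py), (qx, qy) = rng.choice(matches)
        dx, dy = qx - px, qy - py
        # Count correspondences consistent with the hypothesis (inliers).
        inliers = [m for m in matches
                   if abs(m[1][0] - m[0][0] - dx) < thresh
                   and abs(m[1][1] - m[0][1] - dy) < thresh]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (dx, dy), inliers
    return best_model, best_inliers
```

For example, given correspondences that mostly share a true translation of (5, -3) plus a few gross mismatches, the mismatches fail the consistency test and are excluded from the returned inlier set, so the estimated motion is unaffected by them.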
Corresponding author: He Huang

This work is licensed under a Creative Commons Attribution 4.0 International License.

Cite this article: Bo Hu and He Huang, Visual Odometry Implementation and Accuracy Evaluation Based on Real-time Appearance-based Mapping, Sens. Mater., Vol. 32, No. 7, 2020, pp. 2261-2275.