
Print: ISSN 0914-4935
Online: ISSN 2435-0869

Sensors and Materials is an international peer-reviewed open-access journal providing a forum for researchers working in multidisciplinary fields of sensing technology. Sensors and Materials is covered by Science Citation Index Expanded (Clarivate Analytics), Scopus (Elsevier), and other databases.

Instructions to authors
English | Japanese

Instructions for manuscript preparation
English | Japanese

Template
English

Publisher
 MYU K.K.
 Sensors and Materials
 1-23-3-303 Sendagi,
 Bunkyo-ku, Tokyo 113-0022, Japan
 Tel: 81-3-3827-8549
 Fax: 81-3-3827-8547


Sensors and Materials, Volume 35, Number 1(2) (2023)
Copyright (C) MYU K.K.
pp. 167-181
S&M3161 Research Paper of Special Issue
https://doi.org/10.18494/SAM4225
Published: January 31, 2023

A Model of Real-time Pose Estimation Fusing Camera and LiDAR in Simultaneous Localization and Mapping by a Geometric Method

De Chen, Qingdong Yan, Zhi Zeng, Junfeng Kang, and Junxiong Zhou

(Received October 31, 2022; Accepted January 16, 2023)

Keywords: light detection and ranging (LiDAR), RGB-D (RGB-depth map), robot, simultaneous localization and mapping (SLAM), pose estimation, minimum bounding rectangle (MBR)

Simultaneous localization and mapping (SLAM) is the key technology for achieving autonomous navigation and stable walking for robots. In dynamic and complex indoor and outdoor environments, a single sensor still has limitations in estimating a robot's position and orientation. To further improve the real-time accuracy of SLAM positioning, in this study, we combine the advantages of the RGB-depth map (RGB-D) and light detection and ranging (LiDAR) and propose a two-stage deep fusion framework, the convolutional neural network–LiDAR vision inertial measurement unit (CNN–LVI), for real-time pose estimation by a geometric method. Unlike existing methods that use either a two-stage framework or multistage pipelines, the proposed framework fuses image and raw 3D point cloud data after multisensor joint calibration, and then uses 3D point clouds as spatial anchors to predict the pose between two sequence frames. A CNN algorithm identifies and extracts a 3D bounding box, whose projection onto the RGB image is tracked to obtain the target minimum bounding rectangle (MBR). Finally, the rotation angle and translation distance are calculated by a geometric method from the centroid of the target MBR, and are jointly optimized with inertial measurement unit data to estimate the robot's pose and further improve location accuracy. Experiments show that the proposed model achieves significant performance improvement over many other methods in the car class and achieves the best trade-off between performance and accuracy among state-of-the-art methods on the KITTI benchmark.
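The centroid-based geometric step described in the abstract can be sketched as follows. This is an illustrative simplification, not the authors' exact formulation: the MBR coordinates, the `pose_delta` helper, and the use of the displacement direction as the heading change are all assumptions made for the example.

```python
import math

def mbr_centroid(box):
    """Centroid of an axis-aligned MBR given as (x_min, y_min, x_max, y_max)."""
    x_min, y_min, x_max, y_max = box
    return ((x_min + x_max) / 2.0, (y_min + y_max) / 2.0)

def pose_delta(c_prev, c_curr):
    """Rotation angle (rad) and translation distance between two centroid
    observations of the same tracked object in consecutive frames, taking
    the displacement direction as the heading change (a simplification)."""
    dx = c_curr[0] - c_prev[0]
    dy = c_curr[1] - c_prev[1]
    angle = math.atan2(dy, dx)   # rotation angle from the geometry
    dist = math.hypot(dx, dy)    # translation distance
    return angle, dist

# Example: a tracked object's MBR moves between two sequence frames.
prev_box = (10.0, 10.0, 30.0, 20.0)   # centroid (20, 15)
curr_box = (14.0, 13.0, 34.0, 23.0)   # centroid (24, 18)
angle, dist = pose_delta(mbr_centroid(prev_box), mbr_centroid(curr_box))
```

In the paper's full pipeline, a delta like this would then be fused with inertial measurement unit data in a joint optimization; that step is omitted here.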

Corresponding author: Qingdong Yan


This work is licensed under a Creative Commons Attribution 4.0 International License.

Cite this article
De Chen, Qingdong Yan, Zhi Zeng, Junfeng Kang, and Junxiong Zhou, A Model of Real-time Pose Estimation Fusing Camera and LiDAR in Simultaneous Localization and Mapping by a Geometric Method, Sens. Mater., Vol. 35, No. 1, 2023, pp. 167-181.



Forthcoming Regular Issues


Forthcoming Special Issues

Special Issue on Applications of Novel Sensors and Related Technologies for Internet of Things
Guest editors: Teen-Hang Meen (National Formosa University), Wenbing Zhao (Cleveland State University), and Cheng-Fu Yang (National University of Kaohsiung)
Call for papers


Special Issue on Advanced Sensing Technologies for Green Energy
Guest editor: Yong Zhu (Griffith University)
Call for papers


Special Issue on Room-temperature-operation Solid-state Radiation Detectors
Guest editor: Toru Aoki (Shizuoka University)
Call for papers


Special Issue on International Conference on Biosensors, Bioelectronics, Biomedical Devices, BioMEMS/NEMS and Applications 2023 (Bio4Apps 2023)
Guest editors: Dzung Viet Dao (Griffith University) and Cong Thanh Nguyen (Griffith University)
Conference website
Call for papers


Special Issue on Advanced Sensing Technologies and Their Applications in Human/Animal Activity Recognition and Behavior Understanding
Guest editor: Kaori Fujinami (Tokyo University of Agriculture and Technology)
Call for papers


Special Issue on Piezoelectric Thin Films and Piezoelectric MEMS
Guest editor: Isaku Kanno (Kobe University)
Call for papers


Copyright (C) MYU K.K. All Rights Reserved.