S&M 3166 Review Paper of Special Issue
https://doi.org/10.18494/SAM4211
Published: January 31, 2023

A Review of Indoor Automation Modeling Based on Light Detection and Ranging Point Clouds

Yang Cui, Bogang Yang, Peng Liu, and Lingyan Kong
(Received October 30, 2022; Accepted January 10, 2023)

Keywords: 3D indoor modeling, laser scanning sensor, standards, point cloud acquisition and characteristics, object classification, room segmentation, model reconstruction
3D modeling of the indoor environment is essential for urban applications such as indoor navigation, emergency simulations, floor planning, and building construction. With the development of laser scanning sensors, 3D laser scanners can quickly obtain high-density, high-precision 3D coordinates and attribute information, offering significant advantages for collecting 3D information on indoor scenes. Many studies have been published on the fast reconstruction of 3D models from point cloud data obtained by various types of laser scanning sensors. In this paper, we review state-of-the-art automated 3D indoor modeling technologies. The 3D modeling standards for indoor environments are introduced, and data acquisition based on laser scanning sensors and the characteristics of point clouds are discussed. Indoor object classification and indoor room segmentation are also examined in detail. The 3D indoor reconstruction methods (i.e., line-based, plane-based, and volume-based) are systematically introduced, and the advantages and disadvantages of each are presented. Finally, future research directions in this field are discussed and summarized. This review can help researchers improve current approaches or develop new techniques for 3D indoor reconstruction.
Corresponding author: Bogang Yang

This work is licensed under a Creative Commons Attribution 4.0 International License.

Cite this article: Yang Cui, Bogang Yang, Peng Liu, and Lingyan Kong, A Review of Indoor Automation Modeling Based on Light Detection and Ranging Point Clouds, Sens. Mater., Vol. 35, No. 1, 2023, pp. 247-268.