pp. 3131-3145
S&M 3722, Research Paper of Special Issue
https://doi.org/10.18494/SAM5182
Published: July 31, 2024

Remote Sensing Image Recognition of Dust Cover Net Construction Waste: A Method Combining Convolutional Block Attention Module and U-Net

Shangwei Lv, Xiaoyu Liu, and Yifei Cao
(Received June 10, 2024; Accepted July 16, 2024)

Keywords: construction waste, attention mechanism, U-Net, CBAM, semantic segmentation
With the acceleration of urban development, the annual production of urban construction waste has increased steadily, posing considerable challenges for urban supervision and management; quickly and accurately identifying construction waste is therefore of great practical significance. In this paper, we propose a remote sensing image recognition algorithm for dust cover net construction waste based on an improved U-Net model. The algorithm first prepares a dust cover net construction waste identification dataset from Google high-resolution remote sensing imagery. Second, VGG16 is adopted as the backbone network of the U-Net model to improve its feature expression ability. Finally, the Convolutional Block Attention Module (CBAM) is embedded into the U-Net network to construct the CBAM-U-Net model, enhancing the information extraction accuracy for high-resolution remote sensing images. Taking remote sensing imagery of Daxing District, Beijing, as an example, the results show that the proposed algorithm automatically and efficiently recognizes dust cover net construction waste with 95.51% recognition accuracy and 95.08% MIoU, offering a new approach to the supervision of construction waste.
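The CBAM block named in the abstract applies channel attention followed by spatial attention to a feature map. As a rough illustration of that two-stage mechanism (not the paper's implementation; the MLP weights, reduction ratio, and placeholder averaging kernel below are assumptions for the sketch), a minimal NumPy version might look like:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(x, w1, w2):
    # x: (C, H, W). Avg- and max-pool over spatial dims, pass both pooled
    # vectors through a shared two-layer MLP, sum, and squash with sigmoid.
    avg = x.mean(axis=(1, 2))                      # (C,)
    mx = x.max(axis=(1, 2))                        # (C,)
    mlp = lambda v: w2 @ np.maximum(w1 @ v, 0.0)   # ReLU hidden layer
    scale = sigmoid(mlp(avg) + mlp(mx))            # (C,) per-channel weights
    return x * scale[:, None, None]

def spatial_attention(x, k=7):
    # Channel-wise avg and max maps, stacked and convolved with a k x k
    # kernel (CBAM uses k = 7); here a uniform placeholder kernel stands
    # in for the learned convolution weights.
    avg, mx = x.mean(axis=0), x.max(axis=0)
    stacked = np.stack([avg, mx])                  # (2, H, W)
    pad = k // 2
    padded = np.pad(stacked, ((0, 0), (pad, pad), (pad, pad)))
    kern = np.full((2, k, k), 1.0 / (2 * k * k))
    H, W = avg.shape
    out = np.empty((H, W))
    for i in range(H):
        for j in range(W):
            out[i, j] = np.sum(padded[:, i:i + k, j:j + k] * kern)
    return x * sigmoid(out)[None, :, :]            # per-pixel weights

def cbam(x, w1, w2, k=7):
    # Sequential channel-then-spatial attention, the CBAM ordering.
    return spatial_attention(channel_attention(x, w1, w2), k)

rng = np.random.default_rng(0)
C, H, W, r = 8, 16, 16, 4                          # r: channel reduction ratio
x = rng.standard_normal((C, H, W))
w1 = rng.standard_normal((C // r, C)) * 0.1        # shared MLP weights
w2 = rng.standard_normal((C, C // r)) * 0.1
y = cbam(x, w1, w2)
print(y.shape)                                     # same shape as the input
```

Because both attention stages only rescale the feature map by factors in (0, 1), the output keeps the input's shape, which is what lets the block be dropped into U-Net encoder or decoder stages without altering the skip connections.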
Corresponding author: Shangwei Lv

This work is licensed under a Creative Commons Attribution 4.0 International License.

Cite this article: Shangwei Lv, Xiaoyu Liu, and Yifei Cao, Remote Sensing Image Recognition of Dust Cover Net Construction Waste: A Method Combining Convolutional Block Attention Module and U-Net, Sens. Mater., Vol. 36, No. 7, 2024, pp. 3131-3145.