
Print: ISSN 0914-4935
Online: ISSN 2435-0869
Sensors and Materials is an international peer-reviewed open access journal that provides a forum for researchers working in multidisciplinary fields of sensing technology.
Sensors and Materials is covered by Science Citation Index Expanded (Clarivate Analytics), Scopus (Elsevier), and other databases.

Instructions to authors
English / Japanese

Instructions for manuscript preparation
English / Japanese

Template
English

Publisher
 MYU K.K.
 Sensors and Materials
 1-23-3-303 Sendagi,
 Bunkyo-ku, Tokyo 113-0022, Japan
 Tel: 81-3-3827-8549
 Fax: 81-3-3827-8547


Sensors and Materials, Volume 34, Number 12(4) (2022)
Copyright (C) MYU K.K.
pp. 4599-4614
S&M3131 Research Paper of Special Issue
https://doi.org/10.18494/SAM4159
Published: December 26, 2022

Forest Burn Severity Mapping Using Multispectral Unmanned Aerial Vehicle Images and Light Detection and Ranging (LiDAR) Data: Comparison of Maximum Likelihood, Spectral Angle Mapper, and U-Net Classifiers

Boknam Lee, Bomi Kim, Choongshik Woo, Geonhwi Jung, Gyeongwon Kwon, and Joowon Park

(Received September 30, 2022; Accepted November 24, 2022)

Keywords: forest burn severity mapping, carbon emission, maximum likelihood, spectral angle mapper, U-Net

The automated mapping of forest burn severity using remote sensing imagery has gained popularity over the last decade. However, few studies have examined the performance of a range of classifiers for forest burn severity mapping across different burn severity class schemes. In this study, the performance of three supervised classifiers, maximum likelihood (ML), spectral angle mapper (SAM), and deep learning (U-Net), was evaluated for mapping forest burn severity under different burn severity class settings (two-level burn severity classes: burned and unburned; four-level burn severity classes: crown fire, heat-damaged, ground fire, and unburned). Multispectral unmanned aerial vehicle (UAV) images and light detection and ranging (LiDAR) points obtained from forest fire areas of Andong in South Korea were used to evaluate burn severity. The results show that all classifiers were capable of mapping the two-level burn severity with high overall accuracy (OA) (SAM: OA = 92.05%, kappa coefficient (K) = 0.84; U-Net: OA = 91.83%, K = 0.83; ML: OA = 90.92%, K = 0.82). For four-level burn severity mapping, U-Net (OA = 79.23%, K = 0.64) outperformed the conventional classifiers of SAM (OA = 50.61%, K = 0.38) and ML (OA = 46.85%, K = 0.34). Regarding class separability, SAM and U-Net showed good performance in detecting the severe burn severity class (crown fire areas), whereas a high rate of misclassification occurred in identifying the moderate burn severity classes (heat-damaged, ground fire) for all classifiers. In particular, ML and SAM showed a low capability in identifying unburned areas, while U-Net showed the lowest capability in mapping heat-damaged and ground fire areas. Overall, our study demonstrated that the reliable mapping of burn severity for Korea's forest fires largely depends on the number of levels of burn severity classes as well as the classifier's capability in discriminating moderate burn severity classes.
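The abstract compares classifiers by overall accuracy (OA) and the kappa coefficient (K), and one of the classifiers, SAM, labels each pixel by the spectral angle to reference spectra. As a minimal, generic sketch of these two pieces (not the authors' implementation; the confusion matrix and spectra below are hypothetical), one could compute the SAM decision rule and the OA/kappa metrics as follows:

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Angle (radians) between a pixel spectrum and a reference spectrum.
    A smaller angle means a closer spectral match (the SAM decision rule)."""
    cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def sam_classify(pixels, references):
    """Assign each pixel to the reference class with the smallest spectral angle."""
    angles = np.array([[spectral_angle(p, r) for r in references] for p in pixels])
    return angles.argmin(axis=1)

def overall_accuracy_and_kappa(confusion):
    """OA and kappa from a square confusion matrix (rows: reference, cols: predicted)."""
    confusion = np.asarray(confusion, dtype=float)
    n = confusion.sum()
    po = np.trace(confusion) / n  # observed agreement = overall accuracy
    pe = (confusion.sum(axis=0) * confusion.sum(axis=1)).sum() / n**2  # chance agreement
    return po, (po - pe) / (1.0 - pe)

# Hypothetical two-level (burned/unburned) confusion matrix
cm = [[90, 10],
      [8, 92]]
oa, k = overall_accuracy_and_kappa(cm)
print(round(oa, 3), round(k, 3))  # → 0.91 0.82
```

Kappa discounts the agreement expected by chance, which is why an OA near 91% can correspond to a kappa near 0.82, matching the gap between the OA and K values reported in the abstract.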

Corresponding author: Joowon Park


Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Cite this article
Boknam Lee, Bomi Kim, Choongshik Woo, Geonhwi Jung, Gyeongwon Kwon, and Joowon Park, Forest Burn Severity Mapping Using Multispectral Unmanned Aerial Vehicle Images and Light Detection and Ranging (LiDAR) Data: Comparison of Maximum Likelihood, Spectral Angle Mapper, and U-Net Classifiers, Sens. Mater., Vol. 34, No. 12, 2022, pp. 4599-4614.



Forthcoming Regular Issues


Forthcoming Special Issues

Special Issue on Applications of Novel Sensors and Related Technologies for Internet of Things
Guest editors: Teen-Hang Meen (National Formosa University), Wenbing Zhao (Cleveland State University), and Cheng-Fu Yang (National University of Kaohsiung)
Call for papers


Special Issue on Advanced Sensing Technologies for Green Energy
Guest editor, Yong Zhu (Griffith University)
Call for papers


Special Issue on Room-temperature-operation Solid-state Radiation Detectors
Guest editor, Toru Aoki (Shizuoka University)
Call for papers


Special Issue on International Conference on Biosensors, Bioelectronics, Biomedical Devices, BioMEMS/NEMS and Applications 2023 (Bio4Apps 2023)
Guest editors: Dzung Viet Dao (Griffith University) and Cong Thanh Nguyen (Griffith University)
Conference website
Call for papers


Special Issue on Advanced Sensing Technologies and Their Applications in Human/Animal Activity Recognition and Behavior Understanding
Guest editor, Kaori Fujinami (Tokyo University of Agriculture and Technology)
Call for papers


Special Issue on Signal Collection, Processing, and System Integration in Automation Applications
Guest editor, Hsiung-Cheng Lin (National Chin-Yi University of Technology)
Call for papers


Copyright (C) MYU K.K. All Rights Reserved.