

Print: ISSN 0914-4935
Online: ISSN 2435-0869
Sensors and Materials is an international peer-reviewed open access journal that provides a forum for researchers working in the multidisciplinary fields of sensing technology. Sensors and Materials is covered by Science Citation Index Expanded (Clarivate Analytics), Scopus (Elsevier), and other databases.


Publisher
 MYU K.K.
 Sensors and Materials
 1-23-3-303 Sendagi,
 Bunkyo-ku, Tokyo 113-0022, Japan
 Tel: 81-3-3827-8549
 Fax: 81-3-3827-8547


Sensors and Materials, Volume 36, Number 9(3) (2024)
Copyright (C) MYU K.K.
pp. 3947-3956
S&M3777 Research Paper of Special Issue
https://doi.org/10.18494/SAM5267
Published: September 30, 2024

Improving Performance of Instance Segmentation Model for Building Object Detection Using Contrastive Unpaired Translation

Seung Bae Jeon, Gyusang Kim, Minjae Choi, and Myeong-Hun Jeong

(Received August 8, 2024; Accepted September 9, 2024)

Keywords: contrastive unpaired translation, building segmentation, image translation

With advances in sensors and data collection processes, vast amounts of data are now available, driving the growing application of deep-learning-based artificial intelligence technologies. For instance, object detection from image data is used in fields such as traffic safety, crime prevention and public safety, environmental monitoring, and disaster response in urban areas. Deep-learning-based object detection models achieve high performance, but their performance is limited when the training data are restricted. In particular, detection performance degrades when the collected data differ from the training data in environmental factors such as occlusion and illumination, which vary with solar altitude and shooting conditions. The aim of this study is to enhance object segmentation performance by mitigating these environmental differences between the training and collected data using the contrastive unpaired translation (CUT) algorithm, one of the image-to-image translation algorithms. Specifically, we generate images whose environmental conditions (e.g., shadows and shading) resemble those of the training data. The object segmentation model used in this study was You Only Look Once version 8 (YOLOv8), and instance segmentation was performed on data with and without CUT applied. The results showed an improvement of approximately 11.11% in mAP@50, and statistical verification confirmed that this difference is significant. These results confirm the potential of image translation techniques to improve instance segmentation performance, which can contribute to autonomous driving and unmanned services.
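The comparison described in the abstract — mAP@50 with and without CUT, followed by a significance check — can be sketched in plain Python. The per-image AP@50 scores below are purely illustrative (not the paper's data), and the paired t-test is one plausible choice for the statistical verification; the abstract does not specify which test the authors used.

```python
import math
import statistics

def paired_t(baseline, treated):
    """Paired t statistic on per-image AP@50 scores.

    A positive t means the treated (CUT-translated) scores are higher
    on average than the baseline scores for the same images.
    """
    diffs = [t - b for b, t in zip(baseline, treated)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)  # sample standard deviation of the differences
    return mean_d / (sd_d / math.sqrt(n))

# Hypothetical per-image AP@50 scores (illustrative only).
ap_baseline = [0.60, 0.55, 0.62, 0.58, 0.57]  # segmentation without CUT
ap_cut      = [0.68, 0.63, 0.70, 0.66, 0.64]  # segmentation with CUT applied

map_baseline = statistics.mean(ap_baseline)
map_cut = statistics.mean(ap_cut)
gain_pct = (map_cut - map_baseline) / map_baseline * 100
t_stat = paired_t(ap_baseline, ap_cut)
print(f"mAP@50: {map_baseline:.3f} -> {map_cut:.3f} ({gain_pct:+.2f}%), t = {t_stat:.2f}")
```

With real data, the per-image scores would come from evaluating the YOLOv8 segmentation model on the test set twice (once on the original images, once on the CUT-translated ones) and comparing the paired results.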

Corresponding author: Myeong-Hun Jeong


Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Cite this article
Seung Bae Jeon, Gyusang Kim, Minjae Choi, and Myeong-Hun Jeong, Improving Performance of Instance Segmentation Model for Building Object Detection Using Contrastive Unpaired Translation, Sens. Mater., Vol. 36, No. 9, 2024, pp. 3947-3956.



Forthcoming Regular Issues


Forthcoming Special Issues

Special Issue on Applications of Novel Sensors and Related Technologies for Internet of Things
Guest editors: Teen-Hang Meen (National Formosa University), Wenbing Zhao (Cleveland State University), and Cheng-Fu Yang (National University of Kaohsiung)
Call for papers


Special Issue on Advanced Sensing Technologies for Green Energy
Guest editor: Yong Zhu (Griffith University)
Call for papers


Special Issue on Room-temperature-operation Solid-state Radiation Detectors
Guest editor: Toru Aoki (Shizuoka University)
Call for papers


Special Issue on International Conference on Biosensors, Bioelectronics, Biomedical Devices, BioMEMS/NEMS and Applications 2023 (Bio4Apps 2023)
Guest editors: Dzung Viet Dao (Griffith University) and Cong Thanh Nguyen (Griffith University)
Conference website
Call for papers


Special Issue on Advanced Sensing Technologies and Their Applications in Human/Animal Activity Recognition and Behavior Understanding
Guest editor: Kaori Fujinami (Tokyo University of Agriculture and Technology)
Call for papers


Special Issue on Piezoelectric Thin Films and Piezoelectric MEMS
Guest editor: Isaku Kanno (Kobe University)
Call for papers


Copyright (C) MYU K.K. All Rights Reserved.