Young Researcher Paper Award 2021: Winners

Notice of retraction: Vol. 32, No. 8(2), S&M2292

Print: ISSN 0914-4935
Online: ISSN 2435-0869

Sensors and Materials is an international peer-reviewed open access journal that provides a forum for researchers working in multidisciplinary fields of sensing technology. Sensors and Materials is covered by Science Citation Index Expanded (Clarivate Analytics), Scopus (Elsevier), and other databases.

Instructions to authors (English / Japanese)

Instructions for manuscript preparation (English / Japanese)

Template (English)

Publisher
MYU K.K.
Sensors and Materials
1-23-3-303 Sendagi,
Bunkyo-ku, Tokyo 113-0022, Japan
Tel: +81-3-3827-8549
Fax: +81-3-3827-8547

MYU Research, a scientific publisher, seeks a native English-speaking proofreader with a scientific background. A B.Sc. or higher degree is desirable. This is an in-office position with negotiable work hours. Call 03-3827-8549 for further information.


MYU Research (proofreading and recording)

MYU K.K. (translation service)

The Art of Writing Scientific Papers (how to write scientific papers; Japanese only)


Development of System for Collecting User-specified Training Data for Autonomous Driving Based on Virtual Road Environment

Min-Soo Kim and In-Sung Jang

(Received April 30, 2022; Accepted July 5, 2022)

Keywords: autonomous driving, high definition road, virtual environment, training data, deep learning

Deep learning technologies that use road images to recognize autonomous driving environments have been actively developed. Such deep-learning-based autonomous driving technology requires a large amount of training data representing various road, traffic, and weather environments. However, collecting training data that covers diverse road environments is difficult in terms of both time and cost. Therefore, in this study, we build a virtual road environment and develop a system for collecting training data in that environment. To build a virtual environment that closely replicates the real world, we convert and use two kinds of existing geospatial data: high-definition 3D buildings and high-definition roads. We also develop a system for collecting training data that runs in the virtual environment. The implementation results show that the proposed system can build a virtual environment that closely replicates the real world and can quickly collect specific training data at any time, under various user-specified settings.

Corresponding author: Min-Soo Kim
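
The abstract describes the approach only at a high level. As a minimal illustrative sketch of what "user-specified settings" for such a collection run might look like, here is a Python example; note that VirtualRoadSim, CaptureConfig, and all parameter names below are assumptions for illustration, not the authors' implementation.

from dataclasses import dataclass, field
from typing import Dict, List, Tuple


@dataclass
class CaptureConfig:
    """User-specified settings for one training-data collection run."""
    weather: str = "clear"          # e.g. "clear", "rain", "fog"
    time_of_day: float = 12.0      # hours, 0-24
    traffic_density: float = 0.5   # 0 = empty roads, 1 = heavy traffic
    frames: int = 1000             # number of image/label pairs to record
    labels: List[str] = field(default_factory=lambda: ["lane", "vehicle"])


class VirtualRoadSim:
    """Placeholder for a simulator built from converted geospatial data
    (high-definition 3D buildings and high-definition roads)."""

    def __init__(self, map_path: str):
        self.map_path = map_path

    def apply(self, cfg: CaptureConfig) -> None:
        # A real simulator would reconfigure weather, lighting, and traffic.
        print(f"Loaded {self.map_path}: weather={cfg.weather}, "
              f"traffic={cfg.traffic_density}")

    def step(self) -> Tuple[bytes, Dict[str, list]]:
        # Would render a road image and return it with ground-truth labels
        # derived directly from the virtual scene.
        return b"<image bytes>", {"lane": [], "vehicle": []}


def collect(cfg: CaptureConfig, map_path: str) -> list:
    """Run the simulator and record (image, labels) training pairs."""
    sim = VirtualRoadSim(map_path)
    sim.apply(cfg)
    dataset = []
    for _ in range(cfg.frames):
        image, labels = sim.step()
        # Keep only the label classes the user asked for.
        dataset.append((image, {k: labels[k] for k in cfg.labels}))
    return dataset


if __name__ == "__main__":
    data = collect(CaptureConfig(weather="rain", frames=3), "hd_map_sample")
    print(f"Collected {len(data)} training samples")

The appeal of this setup, as the abstract argues, is that ground-truth labels come directly from the virtual scene rather than from manual annotation, so rare conditions (night, rain, heavy traffic) can be sampled on demand simply by changing the configuration.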




Forthcoming Regular Issues


Forthcoming Special Issues

Special Issue on Biosensors and Biofuel Cells for Smart Community and Smart Life
Guest editors: Seiya Tsujimura (University of Tsukuba), Isao Shitanda (Tokyo University of Science), and Hiroaki Sakamoto (University of Fukui)


Special Issue on the International Multi-Conference on Engineering and Technology Innovation 2021 (IMETI2021)
Guest editor: Wen-Hsiang Hsieh (National Formosa University)
Conference website


Special Issue on Novel Sensors and Related Technologies on IoT Applications: Part 1-2
Guest editors: Teen-Hang Meen (National Formosa University), Wenbing Zhao (Cleveland State University), and Cheng-Fu Yang (National University of Kaohsiung)
Call for papers


Special Issue on Advanced Micro/Nanomaterials for Various Sensor Applications (Selected Papers from ICASI 2021)
Guest editors: Sheng-Joue Young (National United University), Shoou-Jinn Chang (National Cheng Kung University), Liang-Wen Ji (National Formosa University), and Yu-Jen Hsiao (Southern Taiwan University of Science and Technology)
Conference website
Call for papers


Special Issue on Advanced Technologies for Remote Sensing and Geospatial Analysis: Part 3
Guest editors: Dong Ha Lee (Kangwon National University) and Myeong Hun Jeong (Chosun University)
Call for papers


Special Issue on APCOT 2022
Guest editors: Yuelin Wang and Tie Li (Shanghai Institute of Microsystem and Information Technology), and Qingan Huang (Southeast University)
Conference website
Call for papers


Copyright (C) MYU K.K. All Rights Reserved.