
Sensors and Materials
Print: ISSN 0914-4935
Online: ISSN 2435-0869

Sensors and Materials is an international peer-reviewed open-access journal that provides a forum for researchers working in multidisciplinary fields of sensing technology. Sensors and Materials is covered by Science Citation Index Expanded (Clarivate Analytics), Scopus (Elsevier), and other databases.

Publisher
 MYU K.K.
 Sensors and Materials
 1-23-3-303 Sendagi,
 Bunkyo-ku, Tokyo 113-0022, Japan
 Tel: 81-3-3827-8549
 Fax: 81-3-3827-8547


Sensors and Materials, Volume 27, Number 8 (2015)
Copyright (C) MYU K.K.
pp. 755-761
S&M1112 Research Paper of Special Issue
https://doi.org/10.18494/SAM.2015.1106
Published: September 7, 2015

Using Laser Range Finder and Multitarget Tracking-Learning-Detection Algorithm for Intelligent Mobile Robot

Jr-Hung Guo and Kuo-Lan Su

(Received July 2, 2014; Accepted February 10, 2015)

Keywords: laser range finder, multitarget tracking, tracking-learning-detection (TLD), intelligent mobile robot

Self-positioning and obstacle avoidance play an important role in intelligent mobile robot technologies. Because of its fast response and accurate measurement, the laser range finder (LRF) is a relatively good choice of sensor for this purpose. However, the LRF has some drawbacks. One is that it usually senses only in a horizontal plane, so an obstacle may go undetected if it lies outside the detection plane. Obstacles with high reflectivity can also cause the LRF to produce incorrect readings. Either case can lead the mobile robot into a dangerous situation. In this paper, a modified tracking-learning-detection (TLD) image recognition system, which can detect multiple targets simultaneously, is used to assist the LRF in positioning and obstacle avoidance.
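The fusion idea in the abstract can be illustrated with a minimal sketch (hypothetical code, not the authors' implementation): obstacles flagged by the vision-based TLD tracker are merged with those found in the horizontal LRF scan, so that targets outside the LRF's detection plane are still reported. The function names, data layout, and thresholds here are illustrative assumptions.

```python
def lrf_obstacle_bearings(scan, safe_range):
    """Return bearings (deg) where the LRF reports a range below safe_range.

    scan: list of (bearing_deg, range_m) pairs from one horizontal sweep.
    """
    return [b for b, r in scan if r < safe_range]

def fuse_detections(scan, vision_bearings, safe_range):
    """Union of LRF-detected and vision-detected obstacle bearings.

    vision_bearings: bearings of targets reported by the TLD tracker,
    covering obstacles above or below the LRF plane (or ones the LRF
    misreads because of high reflectivity).
    """
    lrf = set(lrf_obstacle_bearings(scan, safe_range))
    return sorted(lrf | set(vision_bearings))

# Example: the LRF sees only the obstacle at 0 deg; the obstacle at
# 10 deg lies below the scan plane but is reported by the vision tracker.
scan = [(-30, 4.0), (0, 0.8), (30, 5.0)]
print(fuse_detections(scan, [10], safe_range=1.0))  # [0, 10]
```

In practice the paper's system would derive `vision_bearings` from the modified multitarget TLD tracker's bounding boxes via the camera's calibration; the sketch only shows why combining the two sensors widens obstacle coverage.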

Corresponding author: Jr-Hung Guo


Cite this article
Jr-Hung Guo and Kuo-Lan Su, Using Laser Range Finder and Multitarget Tracking-Learning-Detection Algorithm for Intelligent Mobile Robot, Sens. Mater., Vol. 27, No. 8, 2015, pp. 755-761.



Forthcoming Special Issues

Special Issue on Signal Collection, Processing, and System Integration in Automation Applications 2026
Guest editors: Hsiung-Cheng Lin (National Chin-Yi University of Technology), Ming-Te Chen (National Chin-Yi University of Technology), and Chin-Yi Cheng (National Yunlin University of Science and Technology)
Call for papers


Special Issue on Advanced GeoAI for Smart Cities: Novel Data Modeling with Multi-source Sensor Data
Guest editor: Prof. Changfeng Jing (China University of Geosciences Beijing)
Call for papers


Special Issue on Advanced Sensor Application Development
Guest editors: Shih-Chen Shi (National Cheng Kung University) and Tao-Hsing Chen (National Kaohsiung University of Science and Technology)
Call for papers


Special Issue on Mobile Computing and Ubiquitous Networking for Smart Society
Guest editors: Akira Uchiyama (The University of Osaka) and Jaehoon Paul Jeong (Sungkyunkwan University)
Call for papers


Special Issue on Advanced Materials and Technologies for Sensor and Artificial-Intelligence-of-Things Applications (Selected Papers from ICASI 2026)
Guest editor: Sheng-Joue Young (National Yunlin University of Science and Technology)
Conference website
Call for papers


Special Issue on Biosensing Devices
Guest editor: Kiyotaka Sasagawa (Nara Institute of Science and Technology)
Call for papers

