
Print: ISSN 0914-4935
Online: ISSN 2435-0869
Sensors and Materials
is an international peer-reviewed open access journal that provides a forum for researchers working in multidisciplinary fields of sensing technology.
Sensors and Materials
is covered by Science Citation Index Expanded (Clarivate Analytics), Scopus (Elsevier), and other databases.

Instructions to authors
English    Japanese

Instructions for manuscript preparation
English    Japanese

Template
English

Publisher
 MYU K.K.
 Sensors and Materials
 1-23-3-303 Sendagi,
 Bunkyo-ku, Tokyo 113-0022, Japan
 Tel: 81-3-3827-8549
 Fax: 81-3-3827-8547



Sensors and Materials, Volume 34, Number 1(3) (2022)
Copyright (C) MYU K.K.
pp. 337-348
S&M2814 Research Paper of Special Issue
https://doi.org/10.18494/SAM3562
Published: January 31, 2022

Indoor Visual Positioning Method Based on Image Features

Xun Liu, He Huang, and Bo Hu

(Received July 21, 2021; Accepted October 25, 2021)

Keywords: indoor visual positioning, ORB feature, bag-of-visual-words model, term frequency–inverse document frequency, efficient perspective-n-point

In this study, we propose an indoor visual positioning method based on image features. RGB-D camera data are used to establish an image database for positioning. The 3D coordinates of pixels are obtained from an RGB image and its depth information, and the oriented FAST and rotated BRIEF (ORB) features of the image are then extracted. The bag-of-visual-words model, in combination with the K-means algorithm and a k-dimensional tree structure, is used to cluster the features and to store and index them in a visual dictionary. In the positioning process, the positioning image is captured with a camera of known parameters, and the term frequency–inverse document frequency model is used to index the image features and match the most similar database image. Finally, using the matched feature points, the efficient perspective-n-point method and bundle adjustment are used to calculate the camera pose of the positioning image, completing indoor positioning. Experiments on real scenes verify the feasibility of the proposed method and its positioning accuracy. The results presented in this study provide a useful reference for the research and application of vision-based indoor positioning.
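The image-retrieval step of the pipeline — scoring database images against the positioning image with TF-IDF-weighted bag-of-visual-words histograms — can be sketched as below. This is a minimal illustration, not the authors' implementation: it assumes each image has already been reduced to a histogram of visual-word counts (the output of quantizing ORB descriptors against a K-means vocabulary), and the function name `tfidf_retrieve` and the toy four-word vocabulary are invented for the example.

```python
import numpy as np

def tfidf_retrieve(db_hist, query_hist):
    """Rank database images against a query image using TF-IDF-weighted
    bag-of-visual-words histograms and cosine similarity.

    db_hist: (N, K) array of raw visual-word counts for N database images.
    query_hist: (K,) array of raw counts for the positioning image.
    Returns database image indices, best match first.
    """
    n_images = db_hist.shape[0]
    # Document frequency: in how many database images each visual word occurs.
    df = np.count_nonzero(db_hist > 0, axis=0)
    # Inverse document frequency with +1 smoothing to avoid division by zero.
    idf = np.log((n_images + 1) / (df + 1))

    def weight(h):
        tf = h / max(h.sum(), 1)           # term frequency within one image
        v = tf * idf                       # TF-IDF weight per visual word
        norm = np.linalg.norm(v)
        return v / norm if norm > 0 else v # unit-normalize for cosine scoring

    q = weight(query_hist)
    scores = np.array([weight(h) @ q for h in db_hist])
    return np.argsort(-scores)             # descending similarity

# Toy example: 3 database images over a 4-word vocabulary.
db = np.array([[5, 0, 1, 0],
               [0, 4, 0, 2],
               [4, 1, 1, 0]], dtype=float)
query = np.array([3, 0, 1, 0], dtype=float)
ranking = tfidf_retrieve(db, query)
print(ranking[0])  # database image 0 is the closest match
```

In the full pipeline, the top-ranked image's 2D–3D feature correspondences would then feed the EPnP solver and bundle adjustment to recover the camera pose; the k-d tree mentioned in the abstract serves to speed up the descriptor-to-word quantization that produces these histograms.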

Corresponding author: He Huang


This work is licensed under a Creative Commons Attribution 4.0 International License.

Cite this article
Xun Liu, He Huang, and Bo Hu, Indoor Visual Positioning Method Based on Image Features, Sens. Mater., Vol. 34, No. 1, 2022, pp. 337–348.



Forthcoming Regular Issues


Forthcoming Special Issues

Special Issue on Applications of Novel Sensors and Related Technologies for Internet of Things
Guest editors: Teen-Hang Meen (National Formosa University), Wenbing Zhao (Cleveland State University), and Cheng-Fu Yang (National University of Kaohsiung)
Call for papers


Special Issue on Advanced Sensing Technologies for Green Energy
Guest editor: Yong Zhu (Griffith University)
Call for papers


Special Issue on Room-temperature-operation Solid-state Radiation Detectors
Guest editor: Toru Aoki (Shizuoka University)
Call for papers


Special Issue on International Conference on Biosensors, Bioelectronics, Biomedical Devices, BioMEMS/NEMS and Applications 2023 (Bio4Apps 2023)
Guest editors: Dzung Viet Dao (Griffith University) and Cong Thanh Nguyen (Griffith University)
Conference website
Call for papers


Special Issue on Advanced Sensing Technologies and Their Applications in Human/Animal Activity Recognition and Behavior Understanding
Guest editor: Kaori Fujinami (Tokyo University of Agriculture and Technology)
Call for papers


Special Issue on Signal Collection, Processing, and System Integration in Automation Applications
Guest editor: Hsiung-Cheng Lin (National Chin-Yi University of Technology)
Call for papers


Copyright (C) MYU K.K. All Rights Reserved.