

Print: ISSN 0914-4935
Online: ISSN 2435-0869

Sensors and Materials is an international peer-reviewed open-access journal that provides a forum for researchers working in multidisciplinary fields of sensing technology. It is covered by Science Citation Index Expanded (Clarivate Analytics), Scopus (Elsevier), and other databases.



Sensors and Materials
1-23-3-303 Sendagi,
Bunkyo-ku, Tokyo 113-0022, Japan
Tel: +81-3-3827-8549
Fax: +81-3-3827-8547


Sensors and Materials, Volume 29, Number 7(2) (2017)
Copyright (C) MYU K.K.
pp. 1061-1067
S&M1399 Research Paper of Special Issue
Published: July 26, 2017

Haptic Telepresence System for Individuals with Visual Impairments

Yeongil Ryu and Eun-Seok Ryu

(Received April 1, 2016; Accepted May 15, 2017)

Keywords: haptic telepresence, individuals with visual impairments, video streaming

In this paper, we propose a new haptic telepresence system for individuals with visual impairments (VIs) that uses a red-green-blue-depth (RGB-D) sensor and a haptic device. Recent improvements in RGB-D sensors have enabled real-time access to 3D spatial information; however, the real-time representation of that information as a tangible haptic experience has not been sufficiently explored. The proposed system therefore addresses the telepresence of remote 3D information captured by an RGB-D sensor through video encoding and 3D depth-map enhancement. The implemented system uses the Microsoft Kinect, an RGB-D sensor that provides depth and 2D color images at approximately 30 fps. Each Kinect depth frame is buffered, projected into a 3D coordinate system at a resolution of 640 × 480 pixels, and then transformed into a 3D map. To verify the benefits of the proposed video content adaptation method for individuals with VIs, we implemented a ‘2D plus depth map’-based haptic telepresence system, conducted user experience experiments, and report user response time results.
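The depth-to-3D projection step described in the abstract can be sketched with the standard pinhole camera model. The intrinsic parameters below are illustrative assumptions for a Kinect-class sensor, not values taken from the paper:

```python
import numpy as np

# Assumed pinhole intrinsics for a 640x480 Kinect-style depth camera
# (focal lengths and principal point are illustrative, not from the paper).
FX, FY = 594.0, 591.0
CX, CY = 320.0, 240.0

def depth_to_point_cloud(depth_mm: np.ndarray) -> np.ndarray:
    """Project a (480, 640) depth frame in millimetres into 3D camera
    coordinates: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy, Z = depth."""
    h, w = depth_mm.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel grids
    z = depth_mm.astype(np.float64) / 1000.0          # metres
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.stack([x, y, z], axis=-1)               # shape (h, w, 3)
```

A per-frame point cloud like this is the natural input for both the 3D map construction and the haptic rendering stage; the pixel at the principal point maps onto the camera's optical axis (X = Y = 0).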

Corresponding author: Eun-Seok Ryu

Cite this article
Yeongil Ryu and Eun-Seok Ryu, Haptic Telepresence System for Individuals with Visual Impairments, Sens. Mater., Vol. 29, No. 7(2), 2017, pp. 1061-1067.

Forthcoming Special Issues

Applications of Novel Sensors and Related Technologies for Internet of Things
Guest editors: Teen-Hang Meen (National Formosa University), Wenbing Zhao (Cleveland State University), and Cheng-Fu Yang (National University of Kaohsiung)
Call for papers

Special Issue on Advanced Data Sensing and Processing Technologies for Smart Community and Smart Life
Guest editor: Tatsuya Yamazaki (Niigata University)
Call for papers

Special Issue on Advanced Sensing Technologies and Their Applications in Human/Animal Activity Recognition and Behavior Understanding
Guest editor: Kaori Fujinami (Tokyo University of Agriculture and Technology)
Call for papers

Special Issue on International Conference on Biosensors, Bioelectronics, Biomedical Devices, BioMEMS/NEMS and Applications 2023 (Bio4Apps 2023)
Guest editors: Dzung Viet Dao (Griffith University) and Cong Thanh Nguyen (Griffith University)
Call for papers

Special Issue on Piezoelectric Thin Films and Piezoelectric MEMS
Guest editor: Isaku Kanno (Kobe University)
Call for papers

Special Issue on Advanced Micro/Nanomaterials for Various Sensor Applications (Selected Papers from ICASI 2024)
Guest editor: Sheng-Joue Young (National United University)
Call for papers
