Sens. Mater., Vol. 32, No. 4, 2020, pp. 1497-1509
S&M2197 Research Paper https://doi.org/10.18494/SAM.2020.2646 Published: April 30, 2020

Context-Aware Assistive Indoor Navigation of Visually Impaired Persons

Chathurika S. Silva and Prasad Wimalaratne
(Received October 3, 2019; Accepted March 17, 2020)

Keywords: sensor fusion, wearable sensors, context awareness, assistive technology, human–computer interaction, mobile–cloud computing
This paper presents an approach to context awareness in navigation for visually impaired
persons through sensor-based obstacle detection, obstacle recognition, sensor fusion, and walking
context analysis. Sonar and vision sensor data are fused using a complementary sensor fusion
approach. A wearable belt has sonar and vision sensors that detect and recognize obstacles,
respectively. A fuzzy logic model handles safety aspects during the navigation of visually
impaired users. Walking context analysis decides the current walking status by using
cues acquired from the smartphone application and the obstacle detection process. Feedback is
provided via audio and tactile cues. A usability evaluation experiment using the proof-of-concept
prototype reveals positive results, and further areas of investigation have been identified.
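The complementary fusion and fuzzy safety handling described above can be illustrated with a minimal sketch. This is not the authors' implementation: the function names, distance thresholds, and triangular membership shapes are all assumptions chosen only to show the idea of combining a sonar range reading with a vision-based obstacle label and mapping distance to a fuzzy safety level.

```python
# Illustrative sketch (assumed, not from the paper): complementary fusion of
# a sonar range reading with a vision-derived obstacle label, followed by a
# simple fuzzy safety assessment. Thresholds and memberships are examples.

def fuzzify_distance(d_m):
    """Membership degrees for distance (metres) in 'near', 'medium', 'far'.
    Shoulder functions at the ends, a triangle peaking at 1.5 m in the middle."""
    near = max(0.0, min(1.0, (1.0 - d_m) / 1.0))   # full at 0 m, zero at 1 m
    medium = max(0.0, 1.0 - abs(d_m - 1.5) / 1.0)  # peak at 1.5 m
    far = max(0.0, min(1.0, (d_m - 2.0) / 1.0))    # zero below 2 m, full at 3 m
    return {"near": near, "medium": medium, "far": far}

def fuse(sonar_distance_m, vision_label):
    """Complementary fusion: sonar contributes range, vision contributes
    obstacle identity; neither sensor alone provides both."""
    mu = fuzzify_distance(sonar_distance_m)
    # Example rule base: near -> danger, medium -> caution, far -> safe.
    dominant = max(mu, key=mu.get)
    level = {"near": "danger", "medium": "caution", "far": "safe"}[dominant]
    return {"obstacle": vision_label,
            "distance_m": sonar_distance_m,
            "safety": level}

print(fuse(0.4, "chair"))  # a chair 0.4 m away is classified as 'danger'
```

In a wearable system of this kind, the resulting safety level would typically be translated into audio or tactile cues of varying urgency.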
Corresponding author: Chathurika S. Silva

This work is licensed under a Creative Commons Attribution 4.0 International License.

Cite this article: Chathurika S. Silva and Prasad Wimalaratne, Context-Aware Assistive Indoor Navigation of Visually Impaired Persons, Sens. Mater., Vol. 32, No. 4, 2020, pp. 1497-1509.