Sensor Fusion Localization and Navigation for Visually Impaired People

Daniele Croce, Giovanni Ettore Galioto, Ilenia Tinnirello, Federica Inderst, Federica Pascucci, Laura Giarré

Research output: Contribution to conference › Other › peer-review

5 Citations (Scopus)

Abstract

In this paper, we present an innovative cyber-physical system for indoor and outdoor localization and navigation, based on the joint use of dead-reckoning and computer vision techniques in a smartphone-centric tracking system. The system is explicitly designed for visually impaired people, but it can easily be generalized to other users, and it is built under the assumption that special reference signals, such as colored tapes, painted lines, or tactile paving, are deployed in the environment to guide visually impaired users along pre-defined paths. Unlike previous works on localization, which rely only on the inertial sensors integrated into smartphones, we exploit the smartphone camera as an additional sensor that, on one side, can help the visually impaired user identify the paths and, on the other side, can provide direction estimates to the tracking system. We demonstrate the effectiveness of our approach by means of experimental tests performed in a real outdoor installation and in a controlled indoor environment.
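The abstract describes fusing inertial dead-reckoning with camera-derived direction estimates of the reference path. The sketch below illustrates one plausible way such a fusion could work: a simple complementary blend of the inertial heading with the camera's estimate of the path direction, followed by a step-based position update. The function names, the blending weight alpha, and the step length are illustrative assumptions, not the authors' implementation.

    import math

    def fuse_heading(inertial_heading, camera_heading, alpha=0.7):
        """Blend the dead-reckoning heading with a camera-based estimate of the
        path direction (e.g., the orientation of a detected colored tape).
        alpha is an assumed blending weight, not taken from the paper."""
        # Blend on the unit circle to avoid wrap-around problems near +/- pi.
        x = alpha * math.cos(inertial_heading) + (1 - alpha) * math.cos(camera_heading)
        y = alpha * math.sin(inertial_heading) + (1 - alpha) * math.sin(camera_heading)
        return math.atan2(y, x)

    def dead_reckoning_step(position, heading, step_length=0.7):
        """Advance the pedestrian position by one detected step along the
        current heading (the constant step length in meters is an assumption)."""
        x, y = position
        return (x + step_length * math.cos(heading),
                y + step_length * math.sin(heading))

    # Hypothetical usage: at each detected step, correct the inertial heading
    # with the camera's path-direction estimate, then update the position.
    position, heading = (0.0, 0.0), 0.0
    for inertial_h, camera_h in [(0.05, 0.00), (0.10, 0.02), (0.08, 0.01)]:
        heading = fuse_heading(inertial_h, camera_h)
        position = dead_reckoning_step(position, heading)
    print(position)

In this reading, the camera acts as an absolute reference that bounds the drift of the inertial heading, while the inertial sensors keep the estimate available between camera detections; how the actual system weights or filters the two sources is detailed in the paper itself.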
Original language: English
Pages: 3191-3196
Number of pages: 6
Publication status: Published - 2018

All Science Journal Classification (ASJC) codes

  • Control and Systems Engineering
  • Control and Optimization
