Demo: Sensor fusion localization and navigation for visually impaired people

Daniele Croce, Giovanni Ettore Galioto, Ilenia Tinnirello, Federica Inderst, Federica Pascucci, Laura Giarré

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

5 Citations (Scopus)


We present an innovative smartphone-centric tracking system for indoor and outdoor environments, based on the joint use of dead-reckoning and computer vision (CV) techniques. The system is explicitly designed for visually impaired people (although it could easily be generalized to other users) and is built under the assumption that special reference signals, such as painted lines, colored tapes or tactile pavings, are deployed in the environment to guide visually impaired users along pre-defined paths. Thanks to highly optimized software, we are able to execute the CV and sensor-fusion algorithms at run time on low-power hardware such as an ordinary smartphone, precisely tracking the users' movements.
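The abstract does not give the fusion algorithm itself, but the idea of correcting inertial dead-reckoning drift with a CV-detected guide line can be illustrated with a minimal sketch. Everything below is an assumption for illustration: the function names (`dead_reckoning_step`, `fuse_with_line_offset`), the complementary-filter blend, and the gain value are hypothetical, not the authors' implementation.

```python
import math

def dead_reckoning_step(pos, heading, step_length):
    """Advance the estimated position by one detected step
    along the current heading (pedestrian dead reckoning)."""
    x, y = pos
    return (x + step_length * math.cos(heading),
            y + step_length * math.sin(heading))

def fuse_with_line_offset(pos, line_y, cv_offset, gain=0.5):
    """Pull the dead-reckoned lateral coordinate toward the
    CV-estimated offset from a guide line at y = line_y.
    A simple complementary-filter blend: gain=0 trusts the
    inertial estimate, gain=1 trusts the camera."""
    x, y = pos
    cv_y = line_y + cv_offset           # lateral position implied by the camera
    return (x, y + gain * (cv_y - y))   # blend the two estimates

# Walk three 0.7 m steps east along a painted line at y = 0,
# starting with 0.3 m of accumulated lateral drift, then correct
# using a CV measurement that places the user 0.1 m off the line.
pos = (0.0, 0.3)
for _ in range(3):
    pos = dead_reckoning_step(pos, 0.0, 0.7)
pos = fuse_with_line_offset(pos, line_y=0.0, cv_offset=0.1)
print(pos)  # lateral drift is pulled toward the CV estimate
```

A real system would replace the scalar blend with a Kalman-style filter over heading and position, but the structure (inertial prediction, vision-based correction) is the same.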
Original language: English
Title of host publication: Proceedings of the Annual International Conference on Mobile Computing and Networking, MOBICOM
Number of pages: 3
Publication status: Published - 2017

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Hardware and Architecture
  • Software
