TY - JOUR
T1 - 3D vision-based handheld system for visually impaired people
T2 - Preliminary results on echo-localization using structured light sensors
AU - Baez, Gonzalo
AU - Prieto, Pablo
AU - Auat Cheein, Fernando A.
N1 - Publisher Copyright:
© 2018 IOP Publishing Ltd.
PY - 2018/7
Y1 - 2018/7
N2 - Echo-localization in visually impaired people plays a crucial role in obstacle avoidance, as long as the obstacle (either an object or a person) makes a sufficiently audible sound. Otherwise, the obstacle remains invisible to the visually impaired person unless it can be detected via, e.g., a white cane. Artificial vision systems, combined with a sound-based device, have proven effective in enhancing the independence and mobility of visually impaired users in daily tasks. In this work, we propose, build and test an interface that converts depth information acquired by a 3D vision system into 3D sound using the head-related transfer function (HRTF). Our system thus registers the environment and renders the objects nearest to the user with a distinctive tone and volume, according to their distance, position and orientation from the vision system's point of view, in real time at the user's head. In addition, our system can benefit from the integration of previously developed approaches, such as object, color and face recognition, to further improve the quality of life of visually impaired people. We test our system on a population of seven volunteers, showing an encouraging exponential learning behaviour in two main tasks: passing through doors and navigating in crowded environments.
AB - Echo-localization in visually impaired people plays a crucial role in obstacle avoidance, as long as the obstacle (either an object or a person) makes a sufficiently audible sound. Otherwise, the obstacle remains invisible to the visually impaired person unless it can be detected via, e.g., a white cane. Artificial vision systems, combined with a sound-based device, have proven effective in enhancing the independence and mobility of visually impaired users in daily tasks. In this work, we propose, build and test an interface that converts depth information acquired by a 3D vision system into 3D sound using the head-related transfer function (HRTF). Our system thus registers the environment and renders the objects nearest to the user with a distinctive tone and volume, according to their distance, position and orientation from the vision system's point of view, in real time at the user's head. In addition, our system can benefit from the integration of previously developed approaches, such as object, color and face recognition, to further improve the quality of life of visually impaired people. We test our system on a population of seven volunteers, showing an encouraging exponential learning behaviour in two main tasks: passing through doors and navigating in crowded environments.
KW - echo-localization
KW - structured light sensors
KW - visually impaired people
UR - http://www.scopus.com/inward/record.url?scp=85053105102&partnerID=8YFLogxK
U2 - 10.1088/2057-1976/aac9ad
DO - 10.1088/2057-1976/aac9ad
M3 - Article
AN - SCOPUS:85053105102
SN - 2057-1976
VL - 4
JO - Biomedical Physics and Engineering Express
JF - Biomedical Physics and Engineering Express
IS - 4
M1 - 047006
ER -