Assisted Navigation for Visually Impaired People Using 3D Audio and Stereoscopic Cameras


Abstract

This paper presents the development and evaluation of an initial, comprehensive assistive navigation prototype that integrates three-dimensional audio spatialization with computer vision to enhance the mobility of visually impaired individuals. The system combines stereoscopic depth perception, real-time point cloud reconstruction, and object detection via a modified YOLO convolutional neural network architecture. It also implements auralization techniques based on head-related impulse responses to generate directional audio cues. For experimental validation, twenty participants, ten of whom had visual impairments, navigated controlled obstacle scenarios while wearing the chest-mounted camera system and specialized headphones. The prototype demonstrated computational efficiency, processing each image in 0.042489 seconds (approximately 23.5 frames per second) and exceeding real-time performance requirements for practical navigation applications. The system achieved 95.00% object classification precision across eleven obstacle categories, successfully identifying common urban navigation hazards, including vehicles, pedestrians, and infrastructure elements. Participants completed navigation tasks with an average collision rate of 0.5 per scenario and a mean completion time of 48 seconds, demonstrating measurable improvements in spatial awareness and obstacle avoidance. The integration of segmented convolution-based audio processing with stereoscopic depth estimation proved highly effective, enabling users to perceive obstacle locations through intuitive spatial sound cues without extensive training, and it can serve as a foundation for ongoing efforts to advance assistive navigation technologies.
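The abstract does not include implementation details, so the following is only a minimal sketch of the kind of segmented (overlap-add) convolution auralization it describes: a mono alert tone is filtered with left- and right-ear impulse responses to place the sound at an obstacle's apparent direction. The function names, block size, and the toy impulse responses (a simple interaural delay and level difference standing in for measured HRIRs) are illustrative assumptions, not the authors' code.

```python
import numpy as np


def overlap_add_convolve(signal, ir, block_size=1024):
    """Segmented (overlap-add) convolution of a mono signal with an impulse response.

    Processing the signal in fixed-size blocks bounds per-block latency, which is
    the usual motivation for segmented convolution in real-time auralization.
    """
    n_fft = 1
    while n_fft < block_size + len(ir) - 1:  # next power of two that fits one block's convolution
        n_fft *= 2
    ir_fft = np.fft.rfft(ir, n_fft)
    out = np.zeros(len(signal) + len(ir) - 1)
    for start in range(0, len(signal), block_size):
        block = signal[start:start + block_size]
        seg = np.fft.irfft(np.fft.rfft(block, n_fft) * ir_fft, n_fft)
        stop = min(start + n_fft, len(out))
        out[start:stop] += seg[: stop - start]  # overlap-add the block's contribution
    return out


def spatialize(mono_cue, hrir_left, hrir_right):
    """Render a mono alert sound as a binaural (stereo) cue using left/right HRIRs."""
    left = overlap_add_convolve(mono_cue, hrir_left)
    right = overlap_add_convolve(mono_cue, hrir_right)
    return np.stack([left, right], axis=-1)


if __name__ == "__main__":
    fs = 44100
    t = np.arange(int(0.2 * fs)) / fs
    beep = 0.5 * np.sin(2 * np.pi * 1000 * t)            # 200 ms, 1 kHz alert tone
    # Placeholder HRIRs: a pure interaural time/level difference suggesting an
    # obstacle to the listener's right (real systems would use measured HRIR sets).
    hrir_left = np.zeros(64)
    hrir_left[30] = 0.6                                   # delayed, attenuated at the left ear
    hrir_right = np.zeros(64)
    hrir_right[0] = 1.0                                   # direct path at the right ear
    binaural = spatialize(beep, hrir_left, hrir_right)
    print(binaural.shape)                                 # (samples, 2) stereo buffer for headphones
```

In the prototype described above, the obstacle direction estimated from the stereoscopic depth data would presumably select which HRIR pair is used, so that the rendered cue appears to come from the obstacle's location.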
