AI- and IoT-Integrated Framework for Intelligent Sensing and Accessibility in Smart Transportation Systems
Abstract
This paper presents an intelligent, sensor-driven framework that integrates emerging technologies to deliver smart, reliable, and accessible transit assistance for visually impaired people (VIPs). The proposed system leverages the Internet of Things (IoT), the Internet of Devices (IoD), GPS, and crowdsensing to collect multimodal data (audio, video, and environmental signals), which are used to characterize and respond to users' real-time mobility needs. A cloud-based architecture performs centralized data fusion and decision-making using artificial intelligence (AI) and machine learning (ML) algorithms, enabling rapid interpretation of sensor inputs and the generation of personalized navigation guidance. The framework is implemented through a mobile application that coordinates data exchange between edge devices and cloud services, providing context-aware navigation, obstacle alerts, and two-way communication with transit operators. Unlike existing assistive mobility solutions, which rely primarily on static location services and lack cross-sensor integration, the proposed system introduces a unified AI-enabled sensing layer that supports dynamic adaptation to complex urban environments. The results demonstrate the framework's potential to enhance autonomy, safety, and situational awareness for VIPs, offering a scalable foundation for inclusive smart transportation systems.
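The abstract describes a pipeline in which edge devices stream multimodal readings to a cloud service that fuses them and returns personalized guidance to the mobile application. The paper does not publish an implementation, so the following Python sketch is purely illustrative: the SensorReading and GuidanceMessage types, the fuse_and_decide function, and the 3-metre obstacle threshold are all hypothetical placeholders, intended only to make the sensing-to-guidance flow concrete under assumed data formats.

# Illustrative sketch only: all names and thresholds here are assumptions,
# not part of the paper's published system.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class SensorReading:
    source: str          # e.g. "gps", "video", "crowdsense", "audio"
    kind: str            # e.g. "position", "obstacle", "hazard_report"
    value: dict          # raw payload forwarded by the edge device
    timestamp: float     # Unix time of capture

@dataclass
class GuidanceMessage:
    text: str            # short phrase suitable for text-to-speech delivery
    priority: int        # 0 = routine, 1 = warning, 2 = immediate alert

def fuse_and_decide(readings: List[SensorReading]) -> Optional[GuidanceMessage]:
    """Combine the latest multimodal readings into a single guidance decision."""
    # Simple last-value fusion: keep the most recent reading per (source, kind).
    latest = {}
    for r in sorted(readings, key=lambda r: r.timestamp):
        latest[(r.source, r.kind)] = r

    # Obstacle reported by the video/ML pipeline within a hypothetical 3 m range.
    obstacle = latest.get(("video", "obstacle"))
    if obstacle and obstacle.value.get("distance_m", 99.0) < 3.0:
        return GuidanceMessage(
            text=f"Obstacle ahead, about {obstacle.value['distance_m']:.0f} metres.",
            priority=2,
        )

    # Crowdsensed hazard reports (e.g. a blocked sidewalk) yield a lower-priority warning.
    hazard = latest.get(("crowdsense", "hazard_report"))
    if hazard:
        return GuidanceMessage(
            text=f"Reported hazard nearby: {hazard.value.get('label', 'unknown')}.",
            priority=1,
        )

    # Routine case: confirm progress along the planned route from the GPS fix.
    if ("gps", "position") in latest:
        return GuidanceMessage(text="On route toward the requested stop.", priority=0)

    return None

if __name__ == "__main__":
    demo = [
        SensorReading("gps", "position", {"lat": 45.5, "lon": -73.6}, 1000.0),
        SensorReading("video", "obstacle", {"distance_m": 2.0, "label": "pole"}, 1001.0),
    ]
    msg = fuse_and_decide(demo)
    print(msg.text if msg else "No guidance")

In a deployed system the fusion step would run in the cloud layer described above, with the mobile application handling transport of readings and spoken delivery of the returned GuidanceMessage; the rule-based decisions shown here stand in for the AI/ML models the paper refers to.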