Certifiable Transformer-Based Sensor Fusion Architecture for Urban Air Mobility
Abstract
Urban Air Mobility (UAM) vehicles rely on robust multi-sensor perception for safe navigation in complex environments. This paper presents a novel sensor fusion architecture using a Transformer-based model to integrate LiDAR, Electro-Optical/Infrared (EO/IR) cameras, GNSS, ADS-B, IMU, and radar data. We detail a hardware-software co-design for real-time embedded deployment, emphasizing certifiability under DO-178C (software) and DO-254 (hardware). A mathematical formulation of the fusion algorithm is provided, leveraging cross-attention to achieve multimodal state estimation. We simulate an urban canyon scenario with multiple UAM vehicles (using ROS2/Gazebo and MATLAB) to evaluate performance. Results demonstrate high accuracy, low latency, and stable confidence intervals, even under sensor degradation or GNSS loss. Comparative analysis shows our Transformer-based fusion outperforms a legacy Extended Kalman Filter and earlier deep models (including a BART-based approach) in both accuracy and robustness. We also discuss how the design handles adversarial sensor inputs and degrades gracefully. The proposed architecture, supported by certifiable development practices and a safety monitor subsystem, offers a viable path toward certification for UAM operations.
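To make the cross-attention fusion idea concrete, the following is a minimal NumPy sketch, not the paper's trained model: random matrices stand in for learned projection weights, and the token shapes (four ego-state query tokens, sixteen LiDAR feature tokens) are illustrative assumptions. Queries derived from one modality attend over key/value tokens from another, producing fused features.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(query_tokens, kv_tokens, d_k, seed=0):
    """Scaled dot-product cross-attention between two modalities.

    query_tokens: (n_q, d_model) tokens from the querying modality
    kv_tokens:    (n_kv, d_model) tokens from the attended modality
    Random weights are illustrative stand-ins for learned parameters.
    """
    rng = np.random.default_rng(seed)
    d_model = query_tokens.shape[-1]
    Wq = rng.standard_normal((d_model, d_k)) / np.sqrt(d_model)
    Wk = rng.standard_normal((kv_tokens.shape[-1], d_k)) / np.sqrt(d_model)
    Wv = rng.standard_normal((kv_tokens.shape[-1], d_k)) / np.sqrt(d_model)
    Q, K, V = query_tokens @ Wq, kv_tokens @ Wk, kv_tokens @ Wv
    scores = Q @ K.T / np.sqrt(d_k)          # (n_q, n_kv) similarity
    attn = softmax(scores, axis=-1)          # each query's weights sum to 1
    return attn @ V, attn                    # fused features, attention map

# Hypothetical shapes: 4 ego-state queries (e.g. IMU/GNSS-derived),
# 16 LiDAR feature tokens, model dim 32, attention dim 16.
ego = np.random.default_rng(1).standard_normal((4, 32))
lidar = np.random.default_rng(2).standard_normal((16, 32))
fused, attn = cross_attention(ego, lidar, d_k=16)
print(fused.shape, attn.shape)  # (4, 16) (4, 16)
```

In the full architecture each sensor stream would contribute its own token sequence, with learned per-modality encoders replacing the random projections and the attention map doubling as an interpretable weighting over sensor evidence.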