E2ETrADS: End-to-End Transformer Based Autonomous Driving System for Adverse Weather Conditions


Abstract

Adverse weather conditions such as snow, heavy rain, fog, and low illumination pose significant challenges for autonomous vehicles (AVs) by degrading the reliability of their sensors. This paper proposes a Transformer-based navigation strategy to improve autonomous driving under such conditions. A dataset is generated in the CARLA (Car Learning to Act) simulator by varying environmental parameters to reduce visibility and degrade sensor performance; it contains weather-affected multi-modal sensor data, including LiDAR point clouds, depth maps, and camera images. We propose a deep learning model with a transformer-based architecture that fuses data from multiple sensors to support decision-making. The model is trained by imitation learning on the control actions of a hybrid MPC-PID controller: a weather-adaptive MPC optimizes control commands using an environmental risk-aware cost function, while the PID component handles low-level actuation. The results show that sensor fusion, particularly when combined with transformer models, substantially improves the resilience of autonomous driving systems in inclement weather. The proposed model outperforms the Transfuser baseline in fog, rain, and low-visibility scenarios and commits far fewer infractions.
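The transformer-based sensor fusion described above can be illustrated with a minimal sketch. This is not the paper's implementation; it is a toy NumPy example, with all token counts, dimensions, and the linear control head chosen arbitrarily, showing the core mechanism: camera feature tokens cross-attend over LiDAR feature tokens so that geometric information is injected into appearance features before a control command is predicted.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(query, keys, values):
    # scaled dot-product attention: each query token attends over all keys
    d = query.shape[-1]
    scores = query @ keys.T / np.sqrt(d)   # (n_q, n_k) similarity scores
    weights = softmax(scores, axis=-1)     # rows sum to 1
    return weights @ values                # (n_q, d) attended values

# Hypothetical feature tokens (shapes are assumptions, not from the paper):
# 8 camera tokens and 16 LiDAR tokens, each a 32-dim embedding.
cam_tokens = rng.normal(size=(8, 32))
lidar_tokens = rng.normal(size=(16, 32))

# Camera tokens query the LiDAR tokens; a residual connection keeps
# the original appearance features alongside the fused geometry.
fused = cam_tokens + cross_attention(cam_tokens, lidar_tokens, lidar_tokens)

# Pool the fused tokens and map to (steer, throttle) with a toy linear head;
# in the paper this role is played by the learned policy imitating MPC-PID.
w_head = rng.normal(size=(32, 2)) * 0.01
steer, throttle = np.tanh(fused.mean(axis=0) @ w_head)
print(fused.shape, steer, throttle)
```

In a full model the tokens would come from convolutional or point-cloud encoders, the attention would be multi-headed and stacked, and the head would be trained by imitation loss against the expert controller's commands.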
