Real-Time Driver State Detection Using mmWave Radar: A Spatiotemporal Fusion Network for Behavior and Physiological Monitoring on Edge Platforms

Abstract

Fatigue and distracted driving are among the leading causes of traffic accidents, underscoring the need for efficient, non-intrusive driver monitoring systems. Traditional camera-based methods are often limited by lighting variations, occlusions, and privacy concerns. Millimeter-wave radar, in contrast, offers a non-contact, privacy-preserving, and environment-robust alternative. This study introduces a novel deep learning model, RTSFN (Radar-based Temporal-Spatial Fusion Network), which jointly analyzes the driver's temporal motion changes and spatial posture features. RTSFN incorporates a cross-gated fusion mechanism that dynamically integrates the two modalities, enhancing feature complementarity and stabilizing behavior recognition. The system also integrates a proprietary radar signal processing pipeline whose physiological signal module performs adaptive target selection: it dynamically chooses the optimal sensing region based on the driver's position and signal quality to better capture micro-movements. This module not only estimates physiological indicators such as heart rate and respiration but also improves the overall accuracy and reliability of driver state assessment. Experimental results show that RTSFN achieves over 94% accuracy in detecting high-risk driving behaviors and runs in real time on edge devices such as the NVIDIA Jetson Orin Nano, demonstrating its strong potential for deployment in intelligent transportation and in-vehicle safety systems.
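The abstract does not specify RTSFN's exact gating layout, but the cross-gated fusion idea it describes — each modality modulating the other before the features are combined — follows a common pattern. The sketch below illustrates that pattern in plain Python with illustrative (made-up) weights; the function and variable names are assumptions, not the authors' implementation:

```python
import math

def sigmoid(x):
    """Logistic function, squashing a real value into (0, 1) for use as a gate."""
    return 1.0 / (1.0 + math.exp(-x))

def cross_gated_fusion(temporal, spatial, w_t, w_s):
    """One common cross-gating pattern (illustrative, not RTSFN's actual layers).

    Each modality's gate is computed from the *other* modality's features,
    so temporal features are weighted by what the spatial branch observes
    and vice versa; the gated features are then summed element-wise.
    """
    # Gate for the temporal branch, driven by the spatial features.
    gate_t = [sigmoid(sum(w * f for w, f in zip(row, spatial))) for row in w_t]
    # Gate for the spatial branch, driven by the temporal features.
    gate_s = [sigmoid(sum(w * f for w, f in zip(row, temporal))) for row in w_s]
    # Element-wise gated sum produces the fused representation.
    return [g1 * t + g2 * s
            for g1, t, g2, s in zip(gate_t, temporal, gate_s, spatial)]

# Toy example: 2-dimensional features, zero gating weights -> both gates are 0.5.
temporal = [1.0, 0.0]   # e.g. motion-change features
spatial  = [0.0, 1.0]   # e.g. posture features
w_t = [[0.0, 0.0], [0.0, 0.0]]
w_s = [[0.0, 0.0], [0.0, 0.0]]
fused = cross_gated_fusion(temporal, spatial, w_t, w_s)
# With zero weights each gate is sigmoid(0) = 0.5, so fused = [0.5, 0.5].
```

In a trained network the weight matrices would be learned, letting the model down-weight a modality whenever the other indicates its signal is unreliable, which is one plausible reading of the "dynamic integration" the abstract describes.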
