Non-Contact Fall Detection System Using 4D Imaging Radar for Elderly Safety Based on a CNN Model


Abstract

Progressive global aging has increased the number of elderly individuals living alone. The consequent rise in fall accidents has led to more severe physical injuries, reduced quality of life, and higher medical expenses. Existing wearable fall-detection devices may cause discomfort, and camera-based systems raise privacy concerns. Here, we propose a non-contact fall-detection system that integrates 4D imaging radar sensors with artificial intelligence (AI) to detect falls through real-time monitoring and visualization, using a web-based dashboard and a Unity engine-based avatar, along with immediate alerts. The system eliminates the need for uncomfortable wearable devices and mitigates the privacy issues associated with cameras. The radar sensors generate point cloud data (spatial coordinates, velocity, Doppler power, and time), which allow analysis of body position and movement. A CNN model classifies postures into standing, sitting, and lying, while changes in speed and position distinguish falling actions from lying-down actions. The point cloud data were normalized and organized using zero padding and k-means clustering to improve learning efficiency. The model achieved 98.66% accuracy in posture classification and 95% in fall detection. This study demonstrates the effectiveness of the proposed fall-detection approach and suggests future directions for multi-sensor integration in indoor applications.
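The preprocessing described above (zero-padding each radar frame to a fixed point count and grouping points with k-means clustering) can be sketched as follows. This is a minimal illustration, not the authors' exact pipeline: the frame size of 64 points, the cluster count of 3, and the five-feature layout (x, y, z, velocity, Doppler power) are assumptions chosen for the example.

```python
import numpy as np

def pad_frame(points, max_points=64):
    """Zero-pad (or truncate) one radar frame to a fixed point count.

    points: (n, 5) array of (x, y, z, velocity, doppler_power).
    Fixed-size frames let variable-length point clouds feed a CNN.
    max_points=64 is an illustrative choice, not the paper's setting.
    """
    out = np.zeros((max_points, points.shape[1]), dtype=np.float32)
    n = min(len(points), max_points)
    out[:n] = points[:n]
    return out

def kmeans(points, k=3, iters=20, seed=0):
    """Minimal k-means on spatial coordinates to organize the point cloud.

    Returns per-point cluster labels and the k cluster centers.
    """
    rng = np.random.default_rng(seed)
    # Initialize centers from k distinct input points.
    centers = points[rng.choice(len(points), size=k, replace=False)].copy()
    for _ in range(iters):
        # Assign each point to its nearest center.
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=-1)
        labels = dists.argmin(axis=1)
        # Move each center to the mean of its assigned points.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return labels, centers
```

In a pipeline like the one the abstract describes, each padded frame (or a stack of frames over time) would then be fed to the CNN posture classifier, with the cluster structure helping separate body regions from noise points.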
