A Wearable Sensors Based Elderly Activity Recognition Model Using Deep Learning
Abstract
Human Activity Recognition (HAR) refers to the automatic recognition of different human physical activities such as walking, sitting, and standing. Elderly people are more prone to falls, which may cause serious injuries. As a result, HAR plays an important role in improving their safety, health, and quality of life. It involves monitoring their daily activities, from sitting to sleeping, to ensure their well-being. This facilitates rehabilitation monitoring, to verify that patients are adhering to treatment regimens, as well as health monitoring, which tracks activity levels to detect anomalies that may indicate health problems. Many elderly people live alone or in care homes; with regular monitoring, assistance can be offered so that they maintain health and safety while living independently. This is achieved with the aid of sensors, including accelerometers, gyroscopes, smartphones, and various wearable devices, combined with artificial intelligence techniques. In this paper, we propose a novel model for recognizing the activities of elderly people based on wearable sensors. The proposed model uses a convolutional neural network (CNN) for feature extraction, followed by a bidirectional long short-term memory (Bi-LSTM) network for bidirectional sequential analysis and a fully connected layer for classifying the different types of activities. The model has been trained and evaluated on three publicly available datasets: MHealth, PAMAP2, and WISDM. It achieved accuracies of 99.3%, 97.9%, and 97.2% on the three datasets, respectively. Moreover, the proposed model outperformed several state-of-the-art proposals by up to 34.9% in terms of accuracy.
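As an illustration of the pipeline described in the abstract (CNN feature extraction, Bi-LSTM sequential analysis, fully connected classification), the following is a minimal Keras sketch. The window length, number of sensor channels, number of activity classes, and all layer hyperparameters are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch of a CNN + Bi-LSTM activity classifier in Keras.
# All sizes below (window length, channels, classes, filters) are
# assumptions for illustration; the abstract does not specify them.
import tensorflow as tf
from tensorflow.keras import layers, models

WINDOW_LEN = 128      # assumed time steps per sliding window
NUM_CHANNELS = 9      # assumed sensor channels (e.g. 3-axis acc/gyro/mag)
NUM_CLASSES = 12      # assumed number of activity classes

def build_cnn_bilstm(window_len=WINDOW_LEN,
                     num_channels=NUM_CHANNELS,
                     num_classes=NUM_CLASSES):
    """CNN feature extractor -> Bi-LSTM -> fully connected classifier."""
    model = models.Sequential([
        layers.Input(shape=(window_len, num_channels)),
        # 1D convolutions extract local features from the raw sensor window
        layers.Conv1D(64, kernel_size=5, activation="relu", padding="same"),
        layers.MaxPooling1D(pool_size=2),
        layers.Conv1D(128, kernel_size=5, activation="relu", padding="same"),
        layers.MaxPooling1D(pool_size=2),
        # Bi-LSTM models the temporal dependencies in both directions
        layers.Bidirectional(layers.LSTM(64)),
        layers.Dropout(0.5),
        # fully connected layer assigns the activity class
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_cnn_bilstm()
model.summary()
```

In practice, the windowed sensor data from datasets such as MHealth, PAMAP2, or WISDM would be shaped as (samples, window_len, num_channels) with one-hot activity labels before being passed to `model.fit`.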