LSTM-H: A Hybrid Deep Learning Model for Accurate Livestock Movement Prediction in UAV-Based Monitoring Systems
Abstract
Accurately predicting livestock movement is a cornerstone of precision agriculture and UAV-based livestock monitoring, enabling smarter resource management, improved animal welfare, and enhanced productivity. However, the unpredictable and dynamic nature of livestock behavior poses significant challenges for traditional mobility prediction models. This study introduces LSTM-H, a hybrid deep learning model that combines the sequential learning power of Long Short-Term Memory (LSTM) networks with the real-time correction capabilities of Kalman Filters (KF) to enhance livestock movement prediction within UAV-based monitoring frameworks. The results show that LSTM-H achieves a mean prediction error of 11.51 meters at the first step and 40.68 meters over a 30-step prediction horizon, outperforming state-of-the-art models by factors of 4.3 to 14.8. By bridging deep learning and adaptive filtering, LSTM-H not only improves prediction accuracy but also paves the way for scalable, real-time livestock and UAV monitoring systems with transformative potential for precision agriculture.
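To make the LSTM-plus-Kalman-filter idea concrete, the sketch below shows one generic way such a hybrid can be wired together: an LSTM proposes the next 2-D position from a window of past positions, and a constant-velocity Kalman filter fuses that proposal with its own motion model. This is a minimal illustration under assumed settings (network size, noise covariances, class and variable names are all hypothetical), not the authors' implementation of LSTM-H.

```python
# Minimal sketch (not the paper's code): an LSTM proposes the next 2-D position,
# and a constant-velocity Kalman filter treats that proposal as a measurement
# and corrects it. All model sizes and noise parameters are assumptions.
import numpy as np
import torch
import torch.nn as nn

class TrajectoryLSTM(nn.Module):
    """Maps a window of past (x, y) positions to a predicted next position."""
    def __init__(self, hidden_size=64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=2, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 2)

    def forward(self, window):                  # window: (batch, steps, 2)
        out, _ = self.lstm(window)
        return self.head(out[:, -1])            # predicted next (x, y)

class ConstantVelocityKF:
    """Standard linear Kalman filter with state [x, y, vx, vy]."""
    def __init__(self, dt=1.0, q=0.5, r=5.0):
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)
        self.Q = q * np.eye(4)                  # process noise (assumed)
        self.R = r * np.eye(2)                  # measurement noise (assumed)
        self.x = np.zeros(4)
        self.P = np.eye(4) * 100.0

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, z):
        y = z - self.H @ self.x                 # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]

# Toy usage: feed the LSTM's proposal into the filter as a measurement.
model = TrajectoryLSTM()
kf = ConstantVelocityKF()
history = torch.randn(1, 10, 2)                 # 10 past positions (placeholder data)
with torch.no_grad():
    lstm_pred = model(history).squeeze(0).numpy()
kf.predict()
fused = kf.update(lstm_pred)                    # corrected next-position estimate
print("LSTM proposal:", lstm_pred, "fused estimate:", fused)
```

In a multi-step setting, the corrected estimate would be appended to the input window and the predict/update cycle repeated over the prediction horizon; the specific fusion scheme used by LSTM-H may differ from this generic arrangement.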