Dynamic Attitude Estimation Method Based on LSTM-Enhanced Extended Kalman Filter

Abstract

Visual–inertial attitude estimation systems often suffer from accuracy degradation and instability when visual measurements are intermittently lost due to occlusion or illumination changes. To address this issue, this paper proposes an LSTM-EKF framework for dynamic attitude estimation under visual information loss. In the proposed method, an LSTM-based vision prediction network is designed to learn the temporal evolution of visual attitude measurements and to provide reliable pseudo-observations when camera data are unavailable, thereby maintaining continuous EKF updates. The algorithm is validated through turntable experiments, including long-term reciprocating rotation tests, continuous visual occlusion scanning experiments, and attitude accuracy evaluation experiments over an extended angular range. Experimental results show that the proposed LSTM-EKF effectively suppresses IMU error accumulation during visual outages and achieves lower RMSE than conventional EKF and adaptive Kalman filter (AKF) methods. In particular, the LSTM-EKF maintains stable estimation performance under partial visual occlusion and extends the effective attitude measurement range beyond the camera's observable limits. These results demonstrate that the proposed method improves the robustness and accuracy of visual–inertial attitude estimation in environments with intermittent visual degradation.
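To make the fusion idea concrete, the following is a minimal sketch, not the authors' implementation: an EKF that propagates attitude from gyroscope rates and, when the camera measurement is missing, substitutes an LSTM-predicted pseudo-observation. All names (VisionLSTM, ekf_update, step), the window length, and the noise values are illustrative assumptions.

```python
# Minimal sketch (assumed interfaces, not the paper's code): EKF attitude update that
# falls back to an LSTM-predicted pseudo-observation during a visual outage.
import numpy as np
import torch
import torch.nn as nn

class VisionLSTM(nn.Module):
    """Learns the temporal evolution of visual attitude measurements (roll, pitch, yaw)."""
    def __init__(self, dim=3, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, dim)

    def forward(self, seq):                 # seq: (1, T, 3) past visual attitudes
        out, _ = self.lstm(seq)
        return self.head(out[:, -1])        # predicted next attitude, shape (1, 3)

def ekf_update(x, P, z, H, R):
    """Standard EKF measurement update for state x with covariance P."""
    y = z - H @ x                           # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

def step(x, P, gyro_rate, dt, vision_meas, history, model, H, R_cam, R_lstm):
    """One propagate/update cycle; uses the LSTM pseudo-observation when vision is lost."""
    x = x + gyro_rate * dt                  # simplified IMU attitude propagation
    P = P + np.eye(3) * 1e-4                # process noise (illustrative value)
    if vision_meas is not None:             # camera available: normal visual update
        x, P = ekf_update(x, P, vision_meas, H, R_cam)
        history.append(vision_meas)
    elif len(history) >= 10:                # outage: LSTM supplies a pseudo-observation
        seq = torch.tensor(np.array(history[-10:]), dtype=torch.float32).unsqueeze(0)
        with torch.no_grad():
            z_pseudo = model(seq).squeeze(0).numpy()
        x, P = ekf_update(x, P, z_pseudo, H, R_lstm)  # inflated R: pseudo-obs trusted less
    return x, P
```

In this sketch, giving the pseudo-observation a larger measurement covariance (R_lstm > R_cam) is one plausible way to reflect its lower reliability relative to real camera data while still bounding IMU drift during the outage.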