Sensor Fusion for Drone Position and Attitude Estimation using Extended Kalman Filter
Abstract
Accurate real-time estimation of a drone's position, velocity, and orientation is crucial for stable navigation, path tracking, and autonomous control. However, achieving robust state estimation is challenging due to sensor noise, environmental disturbances, and the nonlinear nature of drone dynamics. This project investigates the use of an Extended Kalman Filter (EKF) to fuse data from multiple sensors (GPS, an Inertial Measurement Unit (IMU), and a barometric altimeter) to estimate the full 9-state vector of a drone, comprising position, velocity, and attitude, in various motion scenarios. A high-fidelity simulation environment is developed in MATLAB/Simulink, incorporating a physics-based six degrees-of-freedom (6DOF) drone model. Sensor measurements are generated by simulating realistic sampling rates and noise characteristics for each sensor type. The EKF is implemented to fuse GPS-provided position and velocity, IMU-derived acceleration and angular velocity, and barometric altitude into a unified state estimate. The effectiveness of the filter is validated on several trajectory profiles, including linear, sinusoidal, and linearly increasing motions. Root Mean Square Error (RMSE) is used to quantify estimation performance. The results confirm that the EKF significantly reduces sensor noise and drift, yielding reliable full-state estimation even in complex dynamic conditions.
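To illustrate the predict/update cycle underlying the filter described above, the sketch below reduces the problem to a 1-D constant-velocity model fusing noisy position fixes (e.g. GPS) and compares raw versus filtered RMSE. This is a hedged illustration only: the project's actual filter is a nonlinear 9-state EKF in MATLAB/Simulink, and all matrices, noise levels, and sampling rates here are assumed values, not the paper's parameters.

```python
import numpy as np

# Illustrative 1-D constant-velocity Kalman filter (the linear special
# case of an EKF). State x = [position, velocity]; we measure position
# only, mimicking a noisy GPS fix. All numbers below are assumptions.
dt = 0.1                                    # assumed sample period (s)
F = np.array([[1.0, dt], [0.0, 1.0]])       # state transition matrix
H = np.array([[1.0, 0.0]])                  # measurement: position only
Q = np.diag([1e-4, 1e-3])                   # assumed process noise
R = np.array([[0.5 ** 2]])                  # assumed GPS noise (0.5 m std)

def kf_step(x, P, z):
    # Predict: propagate state and covariance through the motion model
    x = F @ x
    P = F @ P @ F.T + Q
    # Update: correct with the measurement z via the Kalman gain
    y = z - H @ x                           # innovation
    S = H @ P @ H.T + R                     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Simulate a drone moving at a constant 2 m/s and filter noisy fixes
rng = np.random.default_rng(0)
true_pos = 2.0 * dt * np.arange(200)
meas = true_pos + rng.normal(0.0, 0.5, size=200)

x, P = np.array([0.0, 0.0]), np.eye(2)
est = []
for z in meas:
    x, P = kf_step(x, P, np.array([z]))
    est.append(x[0])

# RMSE after a burn-in period, as in the abstract's evaluation metric
rmse_raw = np.sqrt(np.mean((meas[50:] - true_pos[50:]) ** 2))
rmse_kf = np.sqrt(np.mean((np.array(est)[50:] - true_pos[50:]) ** 2))
print(f"raw RMSE {rmse_raw:.3f} m, filtered RMSE {rmse_kf:.3f} m")
```

In the full EKF the transition matrix `F` is replaced by the Jacobian of the nonlinear 6DOF dynamics at each step, and the measurement model stacks GPS position/velocity, IMU-derived quantities, and barometric altitude; the predict/update structure is otherwise the same.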