A Blended Extended Kalman Filter Approach for Enhanced AGV Localization in Centralized Camera-Based Control Systems

Abstract

This study addresses enhancing the localization and orientation accuracy of indoor Autonomous Guided Vehicles (AGVs) operating under a centralized, camera-based control system. We investigate and compare the performance of two Extended Kalman Filter (EKF) configurations: a standard EKF and a novel Blended EKF. The research methodology comprises four primary stages: (1) sensor bias correction for the camera (CAM), dead reckoning (DR), and inertial measurement unit (IMU) to improve raw data quality; (2) calculation of sensor weights using the Inverse-Variance Weighting principle, which assigns higher confidence to sensors with lower variance; (3) multi-sensor data fusion to generate a stable state estimate that closely approximates the ground truth (GT); and (4) a comparative performance evaluation between the standard EKF, which processes sensor updates independently, and the Blended EKF, which fuses CAM and DR measurements prior to the filter's update step. Experimental results demonstrate that applying bias correction and inverse-variance weighting significantly reduces the Root Mean Square Error (RMSE) across all sensors. Furthermore, the Blended EKF not only achieved a lower RMSE in certain scenarios but also produced a notably smoother trajectory than the standard EKF. These findings indicate the significant potential of the proposed approach for developing more accurate and robust navigation systems for AGVs in complex indoor environments.
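
To make the abstract's description of inverse-variance weighting and CAM/DR blending more concrete, the sketch below shows one plausible way the blended measurement could be formed before a single EKF update. It is not the authors' implementation; the function names (inverse_variance_weights, blend_measurements), the example variances, and the (x, y)-only state are assumptions for illustration.

```python
import numpy as np

def inverse_variance_weights(variances):
    """Weights proportional to 1/variance, normalized to sum to 1
    (the Inverse-Variance Weighting principle described in the abstract)."""
    inv = 1.0 / np.asarray(variances, dtype=float)
    return inv / inv.sum()

def blend_measurements(cam_pos, dr_pos, cam_var, dr_var):
    """Fuse the camera (CAM) and dead-reckoning (DR) position estimates into a
    single measurement ahead of the EKF update step, as in the Blended EKF
    configuration. Returns the blended measurement and its combined variance."""
    w_cam, w_dr = inverse_variance_weights([cam_var, dr_var])
    z_blended = w_cam * np.asarray(cam_pos, dtype=float) + w_dr * np.asarray(dr_pos, dtype=float)
    var_blended = 1.0 / (1.0 / cam_var + 1.0 / dr_var)  # variance of the weighted combination
    return z_blended, var_blended

# Hypothetical example: a camera fix with low variance and a drifting
# dead-reckoning estimate of an AGV's (x, y) position, in meters.
cam_pos, cam_var = np.array([2.04, 1.48]), 0.01
dr_pos,  dr_var  = np.array([1.90, 1.62]), 0.09
z, r = blend_measurements(cam_pos, dr_pos, cam_var, dr_var)
print(z, r)  # blended measurement and variance fed to one EKF update
```

In this sketch the blended measurement sits close to the camera estimate because the camera's lower variance earns it the larger weight, which is consistent with the abstract's point that higher confidence is assigned to sensors with lower variance; a standard EKF would instead apply the CAM and DR updates as two separate correction steps.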
