Robust 3D Localization for UAV Navigation Using Event Cameras and IMU for Indoor Environments

Abstract

Safe navigation of Unmanned Aerial Vehicles (UAVs) requires knowing their position in real time. Outdoor positioning is largely a solved problem; indoor positioning, by contrast, still relies on solutions that are either imprecise or prohibitively expensive. In this paper, the 3D localization problem is addressed in the context of UAV navigation. The main purpose of this work is to develop and evaluate a robust real-time localization scheme that relies exclusively on an embedded Event Camera and an Inertial Measurement Unit (IMU). Deep learning techniques and robust computer vision algorithms are combined to accurately compute the UAV pose, leveraging the strengths of well-established visual-inertial odometry algorithms and the intrinsic advantages of Event Cameras, such as high dynamic range and the absence of motion blur. Throughout this study, state-of-the-art techniques are selected, refined, implemented, and evaluated. The proposed system demonstrated good performance and acceptable precision, especially in situations with abrupt lighting changes.
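The abstract describes fusing event-camera-based vision with IMU measurements to estimate the UAV pose. As an illustration only (the paper's actual pipeline is not detailed in the abstract), the sketch below shows a minimal loosely coupled scheme: high-rate IMU dead reckoning corrected at a lower rate by a position estimate assumed to come from the event-camera front end (e.g. a learned pose regressor). All function names, the fusion gain, and the sensor rates are hypothetical placeholders, not the authors' method.

```python
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])  # world-frame gravity (m/s^2)

def so3_exp(w):
    """Rotation matrix for a rotation vector w (Rodrigues' formula)."""
    theta = np.linalg.norm(w)
    if theta < 1e-9:
        return np.eye(3)
    k = w / theta
    K = np.array([[0, -k[2], k[1]],
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * K @ K

def imu_propagate(p, v, R, accel, gyro, dt):
    """Dead-reckon position p, velocity v, orientation R with one IMU sample
    (accel and gyro are body-frame measurements)."""
    R_new = R @ so3_exp(gyro * dt)
    a_world = R @ accel + GRAVITY          # remove gravity in the world frame
    v_new = v + a_world * dt
    p_new = p + v * dt + 0.5 * a_world * dt**2
    return p_new, v_new, R_new

def fuse_camera_position(p_imu, p_cam, gain=0.3):
    """Blend the IMU-propagated position with a position estimate derived
    from the event-camera front end (gain is an illustrative constant)."""
    return (1.0 - gain) * p_imu + gain * p_cam

if __name__ == "__main__":
    p, v, R = np.zeros(3), np.zeros(3), np.eye(3)
    dt = 0.005                              # assumed 200 Hz IMU
    for step in range(200):
        accel = np.array([0.0, 0.0, 9.81])  # hovering: thrust cancels gravity
        gyro = np.zeros(3)
        p, v, R = imu_propagate(p, v, R, accel, gyro, dt)
        if step % 40 == 0:                  # assumed ~5 Hz camera update
            p = fuse_camera_position(p, p_cam=np.zeros(3))
    print("fused position:", p)
```

In practice, visual-inertial odometry systems of the kind the abstract references typically use tightly coupled filtering or optimization (e.g. preintegrated IMU factors) rather than this simple blend; the sketch only conveys the division of labor between the high-rate inertial sensor and the lower-rate event-camera pose estimates.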
