Vision-Only Localization of Drones with Optimal Window Velocity Fusion

Abstract

Drone localization is essential for tasks such as navigation, autonomous flight, and object tracking, but it becomes challenging when satellite signals are unavailable. This paper addresses database-free, vision-only localization of flying drones using optimal-window template matching and velocity fusion. Under a flat-ground assumption, multiple optimal windows are derived from a piecewise linear regression model of the image-to-real-world conversion function. Each optimal window serves as a fixed-region template for estimating the drone's instantaneous velocity. The velocities obtained from the multiple optimal windows are integrated by a hybrid fusion rule: a weighted average for lateral (sideways) velocities and a winner-take-all decision for longitudinal velocities. In the experiments, a drone performed a total of six medium-range (800 m to 2 km round trip), high-speed (up to 14 m/s) maneuvering flights over rural and urban areas; the maneuvers included forward-backward passes, zigzags, and banked turns. Performance was evaluated by the root mean squared error (RMSE) and drift error between GNSS-derived ground-truth trajectories and rigid-body-rotated vision-only trajectories. Of the four fusion rules evaluated (simple average, weighted average, winner-take-all, and hybrid fusion), the hybrid fusion rule performed best. The proposed video-stream-based method achieves flight errors ranging from a few meters to tens of meters, corresponding to a few percent of the flight length.
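To make the hybrid fusion rule concrete, the following Python sketch fuses per-window velocity estimates as the abstract describes: a weighted average over the lateral components and a winner-take-all decision over the longitudinal components. The per-window confidence weights are an assumption here (the abstract does not specify them; template-matching scores are one plausible choice), so this is a minimal illustration rather than the authors' implementation.

```python
import numpy as np

def hybrid_fusion(velocities, weights):
    """Fuse per-window velocity estimates with the hybrid rule.

    velocities : (N, 2) array; each row is [lateral, longitudinal]
                 velocity estimated from one optimal window.
    weights    : (N,) array of per-window confidence weights
                 (assumed here, e.g. template-matching scores).
    Returns a fused [lateral, longitudinal] velocity.
    """
    velocities = np.asarray(velocities, dtype=float)
    weights = np.asarray(weights, dtype=float)

    # Lateral (sideways) component: confidence-weighted average.
    lateral = np.sum(weights * velocities[:, 0]) / np.sum(weights)

    # Longitudinal component: winner-take-all, i.e. keep the
    # estimate from the single most confident window.
    longitudinal = velocities[np.argmax(weights), 1]

    return np.array([lateral, longitudinal])

# Example: three windows, the second being the most confident.
v = [[0.5, 12.0], [0.4, 13.5], [0.7, 11.0]]
w = [0.6, 0.9, 0.3]
print(hybrid_fusion(v, w))  # lateral is averaged; longitudinal = 13.5
```

Averaging suits the lateral component, where small per-window errors tend to cancel, while winner-take-all suits the longitudinal component, where averaging over poorly matched windows would systematically bias the along-track speed.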
