Relative Positioning Algorithm for Drone Swarms under GPS-denied Conditions

Abstract

Drone swarms hold great potential for a wide range of applications. The primary challenge in operating a swarm is resolving the relative positions of the drones within it, since a reliable method for relative position sensing is essential for maintaining formation. The mainstream approach to swarm position sensing relies on GPS combined with communication-based position sharing; however, GPS is susceptible to interference and offers poor accuracy in certain environments. To address this issue, this paper proposes an autonomous relative positioning method for drone swarms based on monocular vision. The method computes a target's three-dimensional position in the camera coordinate system by combining target orientation information with distance information. Orientation information is obtained through a target detection algorithm combined with monocular camera principles. To address the insufficient real-time performance of the YOLOv5 algorithm, lightweight architectures and attention mechanisms are integrated, ensuring fast detection speed while maintaining accuracy, and DeepSORT is introduced to provide continuous confirmation of target orientation. Distance information is derived from target detection in conjunction with monocular ranging principles. In dynamic validation experiments, the algorithm achieves average errors of 5.2%, 4.7%, and 3.8% on the X, Y, and Z axes, respectively, demonstrating high positioning accuracy.
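The combination of orientation and distance described above can be illustrated with a standard pinhole back-projection: the pixel location of a detected target fixes a bearing ray in the camera frame, and the monocular range estimate fixes the scale along that ray. The sketch below assumes hypothetical camera intrinsics (`fx`, `fy`, `cx`, `cy`); these values and the function name are illustrative, not taken from the paper.

```python
import numpy as np

# Hypothetical camera intrinsics (assumed values, not from the paper):
# focal lengths in pixels and the principal point of a 640x480 image.
FX, FY = 800.0, 800.0
CX, CY = 320.0, 240.0

def pixel_to_camera_xyz(u, v, distance):
    """Back-project a detected target's pixel centre (u, v) into 3-D
    camera coordinates, given a monocular range estimate `distance`.

    The pinhole model supplies the bearing (orientation) of the target;
    the range estimate supplies the scale, mirroring the orientation +
    distance combination described in the abstract."""
    # Ray through the pixel in normalised camera coordinates.
    ray = np.array([(u - CX) / FX, (v - CY) / FY, 1.0])
    ray /= np.linalg.norm(ray)   # unit-length bearing vector
    return distance * ray        # scale bearing by measured range

# A target detected at the image centre, 10 m away, lies straight
# ahead on the camera's Z axis: (0, 0, 10).
print(pixel_to_camera_xyz(320.0, 240.0, 10.0))
```

Off-centre detections yield non-zero X/Y components while the vector's length stays equal to the measured range, which is why ranging error translates directly into positioning error along all three axes.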
