Integrated deep learning and geo-referencing for drone-based animal tracking with flexible camera angles

Abstract

  • The ability of drones to provide detailed information on animals and their surroundings makes them ideal for studying animal behaviour at fine scales. While drones can provide high-resolution images of what animals are doing, they should also, in theory, be able to provide data on where they are. However, reconstructing geo-referenced tracks from drone videos that follow animals is challenging, particularly because current methods require specific drone flight patterns and large computational power.

  • Here, we combine deep learning and object tracking methods with a novel geo-referencing algorithm, allowing us to track individuals across video frames and reconstruct their geo-referenced trajectories. We used a Region-based Convolutional Neural Network to detect animals and the Hungarian algorithm to link detections across video frames, then geo-referenced each detection in every frame to reconstruct individual trajectories.
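
    The abstract does not give the linking step in detail; a minimal sketch of frame-to-frame linking with the Hungarian algorithm (via SciPy's `linear_sum_assignment`) might look like the following. The function name, the centroid-distance cost, and the `max_dist` gating threshold are illustrative assumptions, not the paper's implementation.

    ```python
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def link_detections(prev_centroids, curr_centroids, max_dist=50.0):
        """Assign current-frame detections to previous-frame tracks by
        minimising total centroid distance (Hungarian algorithm)."""
        prev = np.asarray(prev_centroids, dtype=float)
        curr = np.asarray(curr_centroids, dtype=float)
        # Pairwise Euclidean distances between old and new centroids (pixels).
        cost = np.linalg.norm(prev[:, None, :] - curr[None, :, :], axis=2)
        rows, cols = linear_sum_assignment(cost)
        # Drop assignments whose distance exceeds the gating threshold,
        # leaving those detections to start new tracks.
        return [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= max_dist]

    # Two tracks, two new detections: each track matches its nearest detection.
    matches = link_detections([(0, 0), (100, 100)], [(102, 98), (3, 1)])
    ```

    Unmatched detections would then seed new tracks, and tracks unmatched for several frames would be closed.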

  • We tested our geo-referencing algorithm through multiple drone flights with varying flight parameters over known Ground Control Points. The median (95% CI) geo-referencing error was 2.81 m (0.74–23.23 m), and was reduced by 50% when the drone camera was angled between -90° and -40°. Error increased with drone height and camera angle (-90° refers to the camera pointing towards the ground) but was not affected by drone orientation.
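
    The dependence of error on height and camera angle follows from the projection geometry. As a hedged illustration only (the paper's actual algorithm is not given in the abstract), a flat-ground sketch of projecting the image-centre ray to a ground coordinate from the drone's GPS fix, altitude, gimbal pitch, and heading could be:

    ```python
    import math

    EARTH_RADIUS = 6371000.0  # metres

    def ground_point(lat, lon, alt, pitch_deg, yaw_deg):
        """Project the image-centre ray onto flat ground.
        pitch_deg: gimbal pitch, -90 = nadir; yaw_deg: heading, 0 = north.
        Illustrative flat-terrain model, not the published algorithm."""
        off_nadir = math.radians(90.0 + pitch_deg)   # 0 when pointing straight down
        ground_range = alt * math.tan(off_nadir)     # horizontal distance to target
        yaw = math.radians(yaw_deg)
        dn, de = ground_range * math.cos(yaw), ground_range * math.sin(yaw)
        # Small-offset equirectangular shift of the drone's GPS position.
        dlat = math.degrees(dn / EARTH_RADIUS)
        dlon = math.degrees(de / (EARTH_RADIUS * math.cos(math.radians(lat))))
        return lat + dlat, lon + dlon
    ```

    Because `ground_range` scales with both altitude and `tan` of the off-nadir angle, position error grows with flight height and with shallower camera angles, consistent with the pattern reported above.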

  • We then demonstrate the utility of our framework with empirical examples using consumer-level drones. First, we tracked a volunteer carrying a high-resolution GPS unit and overlaid their GPS tracks on our estimated tracks to quantify tracking error. Next, we used drone videos of two delphinid species (Tursiops truncatus gephyreus and Sousa plumbea) representing varying environmental and flight conditions. We successfully inferred individual tracks across all conditions except when individuals formed tight clusters, in which case tracks were assigned a group identifier.
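
    Quantifying tracking error against a GPS reference amounts to a per-fix distance comparison. A minimal sketch, assuming time-aligned fixes (the pairing and summary choices here are illustrative, not the paper's protocol):

    ```python
    import math

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in metres between two WGS84 points."""
        r = 6371000.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp, dl = p2 - p1, math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    def median_error(gps_track, est_track):
        """Median per-fix error between a GPS track and the drone-derived
        track, assuming both are lists of (lat, lon) sampled at the same times."""
        errs = sorted(haversine_m(*g, *e) for g, e in zip(gps_track, est_track))
        n = len(errs)
        return errs[n // 2] if n % 2 else (errs[n // 2 - 1] + errs[n // 2]) / 2
    ```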

  • Our framework demonstrates an easy and robust approach to translating drone videos of moving animals into geo-referenced animal tracks, applicable in many research contexts. A major advance over previous methods is that our algorithm is robust to different camera angles and provides tracks with accuracy on par with, or even exceeding, that of GPS tracking.

  • Headline

    Automated drone-based animal tracking
