Graph-based Extraction and Tracking of Insect Motion in Event-Camera Data Streams

Abstract

Monitoring pollinator populations requires accurate reconstruction of individual insect flight trajectories from continuous sensor data. Event-based vision sensors capture motion with microsecond resolution, producing sparse data streams well suited to detecting fast-moving insects. We present a two-stage pipeline that extracts flight trajectories from event-camera recordings. In the first stage, a graph neural network (GNN) trained entirely on synthetic data performs semantic segmentation to separate insect-triggered events from background activity, generalizing to real-world recordings with an F1 score of 0.95. In the second stage, graph-based instance segmentation reconstructs individual trajectories from the segmented events through iterative splitting of over-merged paths and re-merging of fragmented tracks. With post-processing parameters tuned via Bayesian optimization, the full pipeline achieves a mean average precision (mAP) of 0.60 and a mean average recall (mAR) of 0.79 on real-world data, up from 0.17 and 0.56 before refinement, while preserving accurate insect counts. These results demonstrate that event-based cameras combined with deep graph learning enable scalable, high-resolution monitoring of insect activity in natural environments.
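To make the graph-based representation concrete, the sketch below shows one common way to turn an event stream into a graph a GNN segmenter could consume: connect events that fall within a spatiotemporal neighborhood. The function name and the radius thresholds (`r_xy`, `r_t`) are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch: build a spatiotemporal graph from event-camera data.
# Thresholds and the helper name are illustrative, not from the paper.
import numpy as np

def build_event_graph(events, r_xy=5.0, r_t=2000.0):
    """Connect events close in space (pixels) and time (microseconds).

    events: (N, 3) array of (x, y, t); returns undirected edges (i, j).
    """
    edges = []
    order = np.argsort(events[:, 2])
    ev = events[order]
    for i in range(len(ev)):
        for j in range(i + 1, len(ev)):
            dt = ev[j, 2] - ev[i, 2]
            if dt > r_t:
                break  # events are time-sorted; later ones are even farther
            if np.hypot(ev[j, 0] - ev[i, 0], ev[j, 1] - ev[i, 1]) <= r_xy:
                edges.append((int(order[i]), int(order[j])))
    return edges

# Three nearby events (a plausible insect track) plus a distant outlier
events = np.array([
    [10.0, 10.0, 0.0],
    [11.0, 10.0, 500.0],
    [12.0, 11.0, 900.0],
    [200.0, 200.0, 50000.0],
])
print(build_event_graph(events))  # the outlier (index 3) stays isolated
```

In a full pipeline, node features (position, time, polarity) and these edges would feed a GNN that labels each event as insect or background, after which connected components of insect-labeled events seed the instance-level trajectory reconstruction.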
