Segmentation-Guided Optical Flow Integration for Ship Tracking in Dynamic Waters

Abstract

Multi-object ship tracking in maritime environments remains challenging due to dynamic water surfaces, camera drift, and low target visibility. Conventional tracking-by-detection pipelines—such as YOLO combined with DeepSORT—often degrade under these conditions, leading to fragmented trajectories and unstable identity association. In this work, we propose a segmentation-guided, flow-aware tracking framework that enhances YOLOv11-based detection and segmentation with motion cues derived from optical flow. The proposed method has three components: (1) segmentation-guided sparse KLT flow, extracting motion only within ship masks to suppress wave/background clutter; (2) camera-motion compensation via median background-flow subtraction; and (3) a flow-aware association cost inside DeepSORT that fuses appearance similarity with motion consistency (with reliability-gated weighting). Together, these additions improve temporal stability without modifying the tracker’s underlying state model. We evaluate the approach on a curated set of maritime videos spanning diverse conditions, including heavy traffic, fog, night scenes, and empty-water cases. Experimental results based on stability-oriented proxy metrics and per-video MOT analyses demonstrate that the proposed method substantially reduces track fragmentation and improves visual consistency compared to the legacy DeepSORT baseline, with additional MOTA gains observed in challenging crowded and camera-motion scenarios, at a moderate computational overhead. These findings highlight the effectiveness of segmentation-guided motion cues for robust ship tracking in real-world maritime surveillance scenarios.
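The camera-motion compensation and flow-aware association cost described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the function names, the linear reliability-gated fusion, and the treatment of all non-mask points as background are assumptions made for clarity.

```python
import numpy as np

def compensate_camera_motion(flow_vectors, ship_mask):
    """Estimate camera drift as the median flow over background points
    (those outside the ship masks) and subtract it from every vector.
    Hypothetical helper illustrating median background-flow subtraction."""
    camera_flow = np.median(flow_vectors[~ship_mask], axis=0)
    return flow_vectors - camera_flow, camera_flow

def fused_association_cost(appearance_dist, motion_dist, flow_reliability, lam=0.5):
    """Fuse appearance and motion distances with a reliability-gated weight.
    When the flow is unreliable (reliability near 0), the cost falls back
    to appearance alone. The linear gating form is an assumption."""
    w = lam * flow_reliability
    return (1.0 - w) * appearance_dist + w * motion_dist

# Synthetic example: three background points drift with the camera at
# (2, 0) px/frame; one ship point moves at (5, 0) in image coordinates.
flow = np.array([[2.0, 0.0], [2.0, 0.0], [2.0, 0.0], [5.0, 0.0]])
mask = np.array([False, False, False, True])  # True = inside a ship mask
compensated, cam = compensate_camera_motion(flow, mask)
# cam is (2, 0); the ship's compensated flow is (3, 0) — its true motion.
```

With reliable flow (`flow_reliability=1.0`, `lam=0.5`), an appearance distance of 0.4 and a motion distance of 0.2 fuse to 0.3; with `flow_reliability=0.0` the cost reverts to the appearance distance, which mirrors the fallback behavior the abstract attributes to reliability gating.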
