E²R-MamTrack: An Edge-Efficient and Robust Mamba-Based Object Detection Framework for Real-Time Logistics Environments
Abstract
Real-time object detection in automated logistics environments demands high accuracy, low latency, and robustness under challenging visual conditions such as dense object distributions, scale variation, occlusion, and illumination degradation. Recently, Mamba-based vision models have emerged as an efficient alternative to Transformer architectures, enabling long-range dependency modeling with linear computational complexity. However, existing Mamba-based object detectors remain limited by high hardware dependency, reduced robustness under adverse conditions, limited generalization across domains, and a lack of deployment-oriented adaptability. To address these limitations, this paper proposes E²R-MamTrack, an Edge-Efficient and Robust Mamba-based Object Detection Framework for logistics scenarios. The proposed methodology introduces four novel components: an Edge-Aware Adaptive Backbone with dynamic channel scaling, a Robust Visual Enhancement Module for low-quality inputs, a Dynamic Confidence-Guided Early-Exit mechanism for adaptive inference, and a Continual Domain Adaptation Head for lifelong learning. Mathematical formulations and algorithmic strategies are developed to jointly optimize efficiency, robustness, and adaptability. The proposed framework aims to significantly reduce average inference cost while maintaining high detection accuracy, enabling practical deployment in large-scale, resource-constrained logistics systems.
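Of the four components, the confidence-guided early-exit mechanism is the most standard and can be illustrated generically. The sketch below is a minimal PyTorch rendering of the general idea only: attach a lightweight prediction head after each backbone stage and stop inference once the prediction is confident enough, so easy inputs skip the more expensive later stages. The class name, stage/head structure, and threshold policy are all illustrative assumptions, not the paper's actual design.

```python
import torch
import torch.nn as nn


class ConfidenceGuidedEarlyExit(nn.Module):
    """Illustrative sketch of confidence-guided early exit (hypothetical design).

    Each backbone stage is followed by a lightweight head; if the maximum
    softmax confidence at a stage clears a threshold for the whole batch,
    inference stops there instead of running the remaining stages.
    """

    def __init__(self, stages, exit_heads, threshold=0.9):
        super().__init__()
        self.stages = nn.ModuleList(stages)          # backbone stages, cheap to expensive
        self.exit_heads = nn.ModuleList(exit_heads)  # one lightweight head per stage
        self.threshold = threshold                   # confidence needed to exit early

    def forward(self, x):
        for i, (stage, head) in enumerate(zip(self.stages, self.exit_heads)):
            x = stage(x)
            logits = head(x)
            # Per-sample confidence = max softmax probability at this exit.
            conf = torch.softmax(logits, dim=-1).max(dim=-1).values
            # Exit only when every sample in the batch is confident enough.
            if conf.min() >= self.threshold:
                return logits, i  # prediction and index of the exit taken
        return logits, len(self.stages) - 1  # fell through to the final stage
```

Average inference cost then scales with input difficulty: a low threshold trades accuracy for speed by exiting earlier, while a threshold above 1.0 disables early exit entirely and always runs the full backbone.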