YOLOv9-SEDA: A Lightweight Object Detection Framework for Precision Pesticide Spraying in Orchard Environments
Abstract
Precise detection of orchard tree canopies and non-target areas is critical for minimizing chemical overuse and enhancing the sustainability of smart agricultural systems. Conventional pesticide application methods often result in excessive agrochemical use and environmental degradation. To address these challenges, this study proposes a real-time intelligent orchard spraying system based on YOLOv9-SEDA, an improved YOLOv9 deep learning architecture optimized for deployment on edge devices. The model integrates depthwise separable convolutions to reduce computational overhead, Efficient Channel Attention (ECA) to enhance feature representation, and a Lookahead optimizer combined with AdamW to improve training stability and convergence. Additionally, the Swish activation function is employed to strengthen nonlinearity and learning efficiency. The system couples real-time visual perception with intelligent control logic to dynamically adjust spray patterns based on canopy presence, reducing unnecessary application in sparse or non-target areas. Field experiments conducted with a structured-light depth camera and a Jetson Xavier NX-based autonomous spraying robot demonstrate the system’s real-time performance and operational viability. YOLOv9-SEDA achieves a precision of 89.5%, a recall of 91.1%, an mAP@0.5 of 94.2%, and an mAP@0.5:0.95 of 84.6%, outperforming state-of-the-art detectors including YOLOv9, YOLOv5, YOLOv7, ATSS, and RetinaNet. Controlled trials show a 20.75% reduction in pesticide consumption and a 97.91% decrease in spray wastage. These findings underscore the potential of deep learning-enabled, resource-efficient vision systems for real-time control in industrial informatics and precision agriculture.
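The following is a minimal PyTorch sketch, not the authors' released code, of the building blocks the abstract names: a depthwise separable convolution with the Swish (SiLU) activation, an Efficient Channel Attention module, and a Lookahead-style outer loop wrapped around AdamW. Channel widths, kernel sizes, and hyperparameters here are illustrative assumptions, not values reported in the paper.

```python
import math
import torch
import torch.nn as nn

class ECA(nn.Module):
    """Efficient Channel Attention: a 1-D conv over pooled channel descriptors."""
    def __init__(self, channels, gamma=2, b=1):
        super().__init__()
        t = int(abs((math.log2(channels) + b) / gamma))   # kernel size adapted to channel count
        k = t if t % 2 else t + 1                          # force odd kernel size
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.conv = nn.Conv1d(1, 1, kernel_size=k, padding=k // 2, bias=False)

    def forward(self, x):
        y = self.pool(x)                                       # (B, C, 1, 1) channel descriptor
        y = self.conv(y.squeeze(-1).transpose(-1, -2))         # 1-D conv across channels
        y = torch.sigmoid(y.transpose(-1, -2).unsqueeze(-1))   # per-channel attention weights
        return x * y

class DWSeparableConv(nn.Module):
    """Depthwise + pointwise convolution with Swish (SiLU) activation."""
    def __init__(self, in_ch, out_ch, k=3, s=1):
        super().__init__()
        self.dw = nn.Conv2d(in_ch, in_ch, k, s, k // 2, groups=in_ch, bias=False)
        self.pw = nn.Conv2d(in_ch, out_ch, 1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.SiLU()  # Swish activation

    def forward(self, x):
        return self.act(self.bn(self.pw(self.dw(x))))

def train_with_lookahead(model, loss_fn, batches, k=5, alpha=0.5):
    """Lookahead outer update around AdamW: keep slow weights and pull them
    toward the fast (inner) weights every k optimizer steps."""
    opt = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=1e-2)
    slow = [p.detach().clone() for p in model.parameters()]
    for step, (images, targets) in enumerate(batches, start=1):
        opt.zero_grad()
        loss_fn(model(images), targets).backward()
        opt.step()                              # fast (inner) AdamW update
        if step % k == 0:                       # Lookahead synchronization
            with torch.no_grad():
                for p, s in zip(model.parameters(), slow):
                    s.add_(alpha * (p - s))     # slow weights interpolate toward fast
                    p.copy_(s)                  # fast weights restart from slow
```

In this sketch, `DWSeparableConv` followed by `ECA` stands in for the lightweight attention-augmented blocks described above, and `train_with_lookahead` illustrates how a Lookahead schedule can stabilize AdamW updates; the actual placement of these modules inside the YOLOv9-SEDA backbone and neck is defined by the paper.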