Multispecies weed mapping using deep learning on UAV imagery for SSWM in maize and tomato

Abstract

Accurate identification and mapping of multiple weed species at early growth stages is a critical step toward operational site-specific weed management (SSWM), yet most UAV-based studies have so far been limited to broad weed categories or a single dominant species. This study evaluated the potential of deep learning models, including three convolutional neural networks (Inception-ResNet-v2, EfficientNet-B0, YOLOv8) and two Vision Transformers (ViT-Base, Swin-T), to classify, detect and map nine common weed species in maize and tomato fields using UAV-based RGB imagery. The two best-performing classifiers were then implemented in object detection frameworks (YOLOv8m and DETA), and species-specific treatment maps were generated by applying adaptive economic weed thresholds to gridded weed density data. Classification results showed that Swin-T and YOLOv8 achieved the highest performance, with weighted F1-scores of 98.1% and 97.0%, respectively. For detection, YOLOv8m was the most accurate and efficient model, with a mean Average Precision of 0.93 and a recall of 0.94, while substantially reducing inference time. The multispecies treatment maps revealed that over 70% of the field area was weed-free, highlighting the potential cost savings over uniform full-field treatments. These maps provide valuable inputs for decision support systems and smart sprayers, gradually advancing SSWM toward more selective, efficient and sustainable weed control.
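The abstract describes a pipeline of per-species detection followed by grid-based thresholding to produce treatment maps. The sketch below illustrates one plausible way to implement that step, not the authors' actual code: it runs a YOLOv8m detector (via the Ultralytics API), bins detections into grid cells, and flags a cell for treatment when any species exceeds its economic threshold. The weights file name, grid size, ground sampling distance, species labels and threshold values are all illustrative assumptions.

```python
# Minimal sketch of a species-specific treatment-map step (assumptions noted below).
import numpy as np
from ultralytics import YOLO  # pip install ultralytics

GRID_M = 1.0          # assumed grid cell size in metres
GSD_M_PER_PX = 0.005  # assumed ground sampling distance of the UAV imagery

# Hypothetical per-species economic thresholds (plants per grid cell).
THRESHOLDS = {"species_A": 2, "species_B": 1, "default": 3}

def treatment_map(image_path, weights="weeds_yolov8m.pt"):
    """Return a boolean grid: True where the cell should be sprayed."""
    model = YOLO(weights)                      # custom multispecies detector (hypothetical weights)
    result = model.predict(image_path, conf=0.25)[0]

    h, w = result.orig_shape                   # image size in pixels
    cell_px = int(GRID_M / GSD_M_PER_PX)       # grid cell size in pixels
    rows, cols = int(np.ceil(h / cell_px)), int(np.ceil(w / cell_px))

    # Count detections per grid cell and species.
    counts = {}
    boxes = result.boxes.xyxy.cpu().numpy()
    classes = result.boxes.cls.cpu().numpy().astype(int)
    for (x1, y1, x2, y2), cls in zip(boxes, classes):
        cx, cy = (x1 + x2) / 2, (y1 + y2) / 2  # box centre
        cell = (int(cy // cell_px), int(cx // cell_px))
        species = result.names[cls]
        counts.setdefault(cell, {}).setdefault(species, 0)
        counts[cell][species] += 1

    # Spray a cell only if some species reaches its economic threshold.
    spray = np.zeros((rows, cols), dtype=bool)
    for (r, c), per_species in counts.items():
        for species, n in per_species.items():
            if n >= THRESHOLDS.get(species, THRESHOLDS["default"]):
                spray[r, c] = True
    return spray
```

In such a scheme, the fraction of cells left unsprayed (the weed-free share reported as over 70% in the abstract) is simply `1 - spray.mean()`, and the boolean grid can be exported as a prescription map for a smart sprayer.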
