YOLO-UFS: A Novel Detection Model for UAVs to Detect Early Forest Fires


Abstract

Forest fires endanger ecosystems and human life, making early detection crucial for effective prevention. Traditional detection methods are often inadequate given the large areas to be covered and their inherent limitations, but drone technology combined with deep learning holds promise. This study investigates the use of small drones equipped with lightweight deep learning models to detect forest fires early. A high-quality dataset constructed from aerial imagery supports robust model training. The proposed YOLO-UFS network, based on YOLOv5s, integrates the C3-MNV4 module, BiFPN, the AF-IoU loss function, and the NAM attention mechanism. With these modifications it achieves 91.3% mAP on the self-built early forest fire dataset. Compared to the original model, YOLO-UFS improves precision by 3.8%, recall by 4.1%, and mAP by 3.2%, while reducing the parameter count by 74.7% and the computational cost by 78.3%. It outperforms other mainstream YOLO algorithms on drone platforms, balancing accuracy and real-time performance. In generalization experiments on public datasets, the model's mAP@0.5 increased from 85.2% to 86.3% and its mAP@0.5:0.95 from 56.7% to 57.9%, an overall mAP gain of 3.3%. The optimized model runs efficiently on the Jetson Nano platform, using 258 MB of RAM and 7.4 MB of storage, at an average frame rate of 30 FPS. Airborne visible-light images thus provide a low-cost, high-precision solution for the early detection of forest fires, enabling UAVs with limited onboard compute to meet the requirements of early detection, early mobilization, and early extinguishment. Future work will focus on multi-sensor data fusion and human–robot collaboration to further improve detection accuracy and reliability.
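The AF-IoU loss mentioned above is built on the standard intersection-over-union between predicted and ground-truth boxes. As a minimal sketch (the exact AF-IoU formulation and its penalty terms are not given in the abstract, so only the plain IoU base is shown), IoU for axis-aligned boxes can be computed as:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Intersection rectangle; width/height clamp to zero when boxes do not overlap.
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    # Union = sum of both areas minus the overlap.
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0


def iou_loss(pred, target):
    """Generic IoU-style regression loss; AF-IoU-type variants add further
    penalty terms on top of this base (hypothetical placeholder, not the
    authors' exact loss)."""
    return 1.0 - iou(pred, target)
```

IoU-family losses are commonly preferred over coordinate-wise L1/L2 regression for box localization because they directly optimize the overlap metric used at evaluation time.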
