Mapping California woodland-chaparral ecosystems following wildfire with diverse drone images and computer vision

Abstract

Fire is a key driver of vegetation dynamics in California's woodland-chaparral ecosystems, and its role has become increasingly important in recent decades as wildfire extent and frequency increase. Understanding post-fire vegetation transitions and the likelihood of type conversion is essential for effective land management. Remote sensing is a powerful tool for mapping vegetation cover and studying post-fire dynamics, but current approaches, which are generally based on satellite sensors, are limited by coarse spatial and temporal resolution and by their typically broad application extents. In contrast, uncrewed aerial vehicles, or "drones," offer great potential to yield low-cost, high-resolution, locally tailored data on vegetation cover and its variation across time and space. With the recent rapid development of technologies for translating raw drone imagery into ecologically relevant data, the power of drone-based research is increasing along with its analytical decision space. In this work, we apply modern methods in image processing and computer vision to generate vegetation maps from a large and diverse dataset of drone images collected under realistic operational constraints. Specifically, our imagery was collected at three study sites across three years by multiple pilots flying different drone models with varying flight parameters. Our analytical approach uses an automated method to spatially co-register all overlapping datasets into a common reference frame. We then generate vegetation predictions within each raw image using a computer vision model and translate these image-level predictions to a geospatial map based on the known positions of the drone camera. Finally, we unify all geospatial predictions from similar dates into a best available prediction for each location. Using this merged representation, we conduct change analysis across years for the landscape area common across years (approximately 100 ha at each of two study sites). When predicting our eight vegetation classes on unseen images, we achieved 94% overall accuracy and 88% class-balanced accuracy. Change analysis revealed surprisingly little change over 3-4 years post-fire, with the key changes being shrub (re)establishment and tree resprouting. Our findings demonstrate the viability of scalable drone-based approaches for tracking vegetation change in fire-prone landscapes.
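
To make the mapping steps concrete, the sketch below illustrates one way image-level predictions could be projected to ground coordinates from known camera positions, and how overlapping geospatial predictions might be merged into a best available label per location. This is a minimal illustration, not the authors' implementation: it assumes a pinhole camera, flat terrain, and known camera intrinsics and pose, and the function names (pixel_to_ground, merge_predictions) are hypothetical.

import numpy as np

def pixel_to_ground(u, v, K, R, cam_pos, ground_z=0.0):
    """Project image pixel (u, v) to a point on the ground plane.

    K: 3x3 camera intrinsics; R: world-from-camera rotation;
    cam_pos: camera position in world coordinates (x, y, altitude).
    Flat terrain at elevation ground_z is a simplification; a real
    pipeline would intersect each viewing ray with a terrain model.
    """
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])  # viewing ray, camera frame
    ray_world = R @ ray_cam                             # rotate into world frame
    t = (ground_z - cam_pos[2]) / ray_world[2]          # scale to hit ground plane
    return cam_pos + t * ray_world

def merge_predictions(preds, cell_size=0.5):
    """Merge per-image predictions into one label per grid cell,
    keeping the label observed from the closest camera view
    (typically the most nadir, least distorted view).

    preds: iterable of (x, y, label, view_distance) tuples.
    """
    best = {}
    for x, y, label, view_dist in preds:
        key = (int(x // cell_size), int(y // cell_size))
        if key not in best or view_dist < best[key][1]:
            best[key] = (label, view_dist)
    return {cell: label for cell, (label, _) in best.items()}

# Example: a nadir view from 120 m; the principal point projects to
# the ground location directly below the camera.
K = np.array([[3000.0, 0.0, 2000.0],
              [0.0, 3000.0, 1500.0],
              [0.0, 0.0, 1.0]])
R = np.diag([1.0, -1.0, -1.0])        # camera z-axis pointing straight down
cam = np.array([0.0, 0.0, 120.0])
print(pixel_to_ground(2000.0, 1500.0, K, R, cam))  # -> [0. 0. 0.]

In the paper's actual workflow, automated co-registration aligns all overlapping datasets before projection, and merging operates on predictions from similar dates; the distance-based tie-breaking above is just one plausible way to pick a "best available" prediction.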
