Deep learning cross-applicability of Uncrewed Aircraft System (UAS) imagery from different disaster types for building damage assessment
Abstract
In recent years, the frequency and intensity of natural disasters have increased worldwide. These disasters cause significant economic losses, with building damage accounting for a substantial portion. Post-disaster response plans rely on inventories of building damage. However, these assessments can be highly subjective, require substantial time, and can expose inspectors to unsafe environments. Automating building damage assessment by combining deep learning with advanced remote sensing technology is currently an active research topic. These efforts are hindered by the limited amount of high-quality training data for each disaster type (e.g., hurricane, wildfire). Buildings damaged by different disasters may show distinct damage patterns due to differing damage mechanisms, posing challenges to data integration across disaster types and to model development. To investigate these issues, this study explores the interrelationship between wildfire and hurricane data by developing models suited to the wildfire and hurricane datasets both individually and jointly, and by combining various backbones and deep learning models. Our approach includes semantic segmentation for pixel-level damage assessment and an analysis of model sensitivity to different amounts of training data. Ultimately, this study addresses the limited data available to train building damage assessment deep learning models by providing a comparative analysis of the inter-applicability of wildfire and hurricane data. A notable finding is that, through transfer learning with only a small portion of target-domain data, data and deep learning models from other disaster types can be leveraged.
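The transfer-learning strategy highlighted in the final finding can be illustrated schematically: pretrain a model on the abundant source-disaster data, then freeze the learned "backbone" parameters and fine-tune only a small head on the limited target-disaster data. The sketch below is a toy, hypothetical stand-in (a one-feature logistic classifier on synthetic data, not the article's segmentation models or datasets) meant only to show the freeze-and-fine-tune pattern.

```python
import math
import random

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def make_data(n, w_true, b_true):
    # Synthetic stand-in for damage labels: x = image feature,
    # y = 1 (damaged) or 0 (intact), drawn from a logistic model.
    data = []
    for _ in range(n):
        x = random.uniform(-2.0, 2.0)
        p = sigmoid(w_true * x + b_true)
        data.append((x, 1 if random.random() < p else 0))
    return data

def train(data, w, b, lr=0.1, epochs=200, freeze_w=False):
    # SGD on the log-loss; freeze_w=True mimics freezing a
    # pretrained backbone and fine-tuning only the head (bias).
    for _ in range(epochs):
        for x, y in data:
            g = sigmoid(w * x + b) - y  # gradient w.r.t. the logit
            if not freeze_w:
                w -= lr * g * x
            b -= lr * g
    return w, b

def accuracy(data, w, b):
    hits = sum(1 for x, y in data
               if (sigmoid(w * x + b) > 0.5) == (y == 1))
    return hits / len(data)

# Abundant "source disaster" data vs. a small, shifted "target disaster" set.
source = make_data(500, w_true=3.0, b_true=-0.5)
target = make_data(20, w_true=3.0, b_true=0.8)

# 1) Pretrain on the source-disaster data.
w, b = train(source, w=0.0, b=0.0)

# 2) Transfer: freeze the learned weight, fine-tune only the bias
#    on the small target-disaster dataset.
w_t, b_t = train(target, w, b, freeze_w=True)

test = make_data(200, w_true=3.0, b_true=0.8)
print(f"target-domain accuracy: {accuracy(test, w_t, b_t):.2f}")
```

In this toy setting, fine-tuning a single parameter on 20 target samples suffices because the frozen backbone already encodes the shared structure; the article's finding is analogous at the scale of full segmentation networks.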