A Survey of Multimodal Data Fusion in Earth Observation-Remote Sensing
Abstract
Multimodal data fusion has emerged as a pivotal technique in Earth Observation–Remote Sensing (EO-RS) analysis, as it enables the integration of heterogeneous data to enhance geo-analysis of the living environment. This survey comprehensively explores multimodal data fusion in EO-RS and introduces a classification framework that organizes fusion methods by their underlying analytical paradigm, distinguishing emerging techniques from established ones. Special consideration is given to pre-processing strategies critical for preserving modal integrity and ensuring fusion fidelity. Through detailed case studies, we demonstrate how fusing disparate modalities such as optical, radar, LiDAR, and hyperspectral data improves model accuracy and scene characterization. We also discuss the challenges inherent in multimodal fusion, including spectral and spatial distortions, computational demands, and data validation constraints. This work provides an overview of the current landscape of EO-RS data fusion and serves as a guide for future research on more robust, domain- and context-aware, and scalable fusion frameworks.