High-Resolution 3D Thermal Mapping: From Dual-Sensor Calibration to Thermally Enriched Point Clouds

Abstract

Thermal imaging is increasingly applied in remote sensing to identify material degradation, monitor structural integrity, and support energy diagnostics. However, its adoption is limited by the low spatial resolution of thermal sensors compared to RGB cameras. This study proposes a modular pipeline to generate thermally enriched 3D point clouds by fusing RGB and thermal imagery acquired simultaneously with a dual-sensor unmanned aerial vehicle system. The methodology includes geometric calibration of both cameras, image undistortion, cross-spectral feature matching, and projection of radiometric data onto the photogrammetric model through a computed homography. Thermal values are extracted using a custom parser and assigned to 3D points based on visibility masks and interpolation strategies. Calibration achieved an 81.8% chessboard detection rate, yielding sub-pixel reprojection errors. Among twelve evaluated algorithms, LightGlue retained 99% of its matches and delivered reprojection accuracies of 18.2% at 1 px, 65.1% at 3 px, and 79% at 5 px. A case study on photovoltaic panels demonstrates the method’s capability to map thermal patterns with low temperature deviation from ground-truth data. Developed entirely in Python, the workflow integrates with Agisoft Metashape or other photogrammetry software. The proposed approach enables cost-effective, high-resolution thermal mapping with applications in civil engineering, cultural heritage conservation, and environmental monitoring.
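
To illustrate the registration step described above, the following is a minimal Python sketch, not the authors' implementation: it assumes pre-computed intrinsics for both cameras and a set of cross-spectral keypoint correspondences (e.g. from LightGlue), and uses OpenCV to undistort the frames, estimate a homography, and resample the radiometric thermal image onto the RGB pixel grid. All function and variable names are illustrative.

    import cv2
    import numpy as np

    def register_thermal_to_rgb(rgb_img, thermal_img,
                                K_rgb, dist_rgb, K_th, dist_th,
                                pts_thermal, pts_rgb):
        """Warp a radiometric thermal frame into the undistorted RGB frame.

        pts_thermal, pts_rgb: Nx2 float arrays of matched keypoints
        (thermal -> RGB), measured on the undistorted images.
        """
        # 1. Remove lens distortion so the mapping is purely projective.
        rgb_u = cv2.undistort(rgb_img, K_rgb, dist_rgb)
        th_u = cv2.undistort(thermal_img, K_th, dist_th)

        # 2. Robustly estimate the homography from the matches
        #    (RANSAC with an illustrative 5 px reprojection threshold).
        H, inliers = cv2.findHomography(pts_thermal.reshape(-1, 1, 2),
                                        pts_rgb.reshape(-1, 1, 2),
                                        cv2.RANSAC, 5.0)

        # 3. Resample the thermal frame onto the RGB pixel grid;
        #    nearest-neighbour interpolation preserves radiometric values.
        h, w = rgb_u.shape[:2]
        th_on_rgb = cv2.warpPerspective(th_u, H, (w, h),
                                        flags=cv2.INTER_NEAREST)

        # 4. Validity mask: RGB pixels actually covered by the thermal frame.
        mask = cv2.warpPerspective(np.ones(th_u.shape[:2], np.uint8), H,
                                   (w, h), flags=cv2.INTER_NEAREST)
        return rgb_u, th_on_rgb, mask.astype(bool)

In such a scheme, the per-pixel temperatures aligned to the RGB frame can subsequently be projected onto the photogrammetric point cloud through the camera poses, with the validity mask limiting the assignment to points actually observed by the thermal sensor, as the abstract outlines.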
