Noise-Aware Event-Based Gaussian Splatting for Robust 3D Reconstruction
Abstract
Three-dimensional (3D) reconstruction, the digital recovery of an object's or scene's geometry, is fundamental to healthcare, autonomous driving, computer graphics, and architecture. Although neural radiance fields (NeRF) and Gaussian splatting have advanced camera-based pipelines, these approaches remain vulnerable to motion blur, defocus, and poor illumination, which degrade geometric accuracy and visual fidelity. We present a noise-aware, event-driven Gaussian-splatting framework that harnesses event cameras, sensors that asynchronously record per-pixel brightness changes with microsecond latency and high dynamic range, to overcome these limitations. Specifically, our method (i) reconstructs high-fidelity 3D geometry solely from event streams, without relying on RGB imagery; (ii) incorporates a hot-pixel filtering technique within COLMAP to reduce sensor-induced noise; (iii) introduces a brightness-aware loss that sharpens fine details; and (iv) adds an optical-flow regularizer that enforces view-to-view structural consistency. By combining the blur- and low-light robustness of event sensing with the computational efficiency of Gaussian splatting, the proposed approach produces accurate, photorealistic 3D reconstructions.
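The abstract does not detail how the hot-pixel filtering in (ii) is implemented. As an illustrative aside only, a common way to suppress hot pixels in event-camera data is to discard events from pixels whose firing rate is a statistical outlier before the data reaches the COLMAP stage. The sketch below is a minimal, hypothetical example of such a filter; the function name, threshold, and event layout are assumptions, not the authors' implementation.

```python
import numpy as np

def filter_hot_pixels(events, sensor_hw, std_threshold=5.0):
    """Drop events from 'hot' pixels that fire far more often than typical pixels.

    events        : (N, 4) array with columns (x, y, t, polarity)  -- assumed layout
    sensor_hw     : (height, width) of the event sensor
    std_threshold : pixels whose event count exceeds mean + std_threshold * std
                    of the active pixels are treated as hot and removed
    """
    h, w = sensor_hw
    x = events[:, 0].astype(int)
    y = events[:, 1].astype(int)

    # Per-pixel event counts accumulated over the whole stream.
    counts = np.zeros((h, w), dtype=np.int64)
    np.add.at(counts, (y, x), 1)

    # Flag pixels whose firing rate is a statistical outlier among active pixels.
    active = counts[counts > 0]
    cutoff = active.mean() + std_threshold * active.std()
    hot = counts > cutoff

    # Keep only events originating from non-hot pixels.
    keep = ~hot[y, x]
    return events[keep]
```

Under these assumptions, the filtered event stream can then be converted to event frames or intensity reconstructions for pose estimation, with noisy pixels no longer corrupting feature matching.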