Physics-Informed Generative Model for 3D Localization Microscopy
Abstract
Localization microscopy techniques have overcome the diffraction limit, enabling nanoscale biological imaging by precisely determining the positions of individual emitters. However, the performance of the deep learning methods commonly applied to these tasks depends heavily on the quality of the training data, which is typically generated through simulation. Creating simulations that perfectly replicate experimental conditions remains challenging, resulting in a persistent simulation-to-experiment (sim2exp) gap. To bridge this gap, we propose a physics-informed generative model that leverages self-supervised learning directly on experimental data. Our model extends the Deep Latent Particles (DLP) framework by incorporating a physical Point Spread Function (PSF) model into the decoder, enabling it to disentangle the learned, realistic imaging environment from precise emitter properties. Trained directly on unlabeled experimental images, our model intrinsically captures realistic backgrounds, noise patterns, and emitter characteristics. The decoder thus acts as a high-fidelity generator, producing fully labeled, realistic training images with known emitter locations. Using these generated datasets significantly improves the performance of supervised localization algorithms, particularly in challenging scenarios such as complex backgrounds and low signal-to-noise ratios. Our results demonstrate substantial improvements in localization accuracy and emitter detection, underscoring the practical benefit of our approach for real-world microscopy applications. We will make our code publicly available.
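To make the decoder structure concrete, below is a minimal PyTorch sketch, not the authors' implementation, of a decoder that combines a differentiable physical PSF renderer with a learned background generator. The Gaussian PSF with a z-dependent (defocus) width, the network sizes, and all identifiers (`render_psf`, `PhysicsInformedDecoder`, `bg_latent`) are illustrative assumptions; the paper's actual PSF model and its integration into the DLP framework will differ.

```python
# Illustrative sketch only: a decoder whose foreground is a fixed physical
# PSF renderer (known emitter positions) and whose background is learned.
import torch
import torch.nn as nn


def render_psf(xy, z, photons, size=32, sigma0=1.2, dz=0.5):
    """Render emitters through an assumed Gaussian PSF with defocus.

    xy:      (B, K, 2) sub-pixel emitter positions in pixel units
    z:       (B, K)    axial positions; defocus broadens the PSF
    photons: (B, K)    emitter intensities
    Returns a (B, 1, size, size) image of summed emitter contributions.
    """
    grid = torch.arange(size, dtype=torch.float32)
    yy, xx = torch.meshgrid(grid, grid, indexing="ij")      # (size, size)
    sigma = sigma0 * torch.sqrt(1.0 + (z / dz) ** 2)        # z-dependent width
    dx = xx[None, None] - xy[..., 0, None, None]            # (B, K, size, size)
    dy = yy[None, None] - xy[..., 1, None, None]
    g = torch.exp(-(dx**2 + dy**2) / (2 * sigma[..., None, None] ** 2))
    g = g / (2 * torch.pi * sigma[..., None, None] ** 2)    # unit total mass
    return (photons[..., None, None] * g).sum(dim=1, keepdim=True)


class PhysicsInformedDecoder(nn.Module):
    """Decoder = physical PSF renderer + learned background generator."""

    def __init__(self, bg_latent_dim=16, size=32):
        super().__init__()
        self.size = size
        self.background = nn.Sequential(  # scene latent -> background image
            nn.Linear(bg_latent_dim, 256), nn.ReLU(),
            nn.Linear(256, size * size), nn.Softplus(),  # non-negative counts
        )

    def forward(self, xy, z, photons, bg_latent):
        fg = render_psf(xy, z, photons, size=self.size)
        bg = self.background(bg_latent).view(-1, 1, self.size, self.size)
        return fg + bg  # expected photons per pixel, before shot noise


# Usage: render 3 emitters per frame over a learned background.
decoder = PhysicsInformedDecoder()
xy = torch.rand(4, 3, 2) * 32          # random sub-pixel positions
z = torch.randn(4, 3) * 0.3            # axial offsets
photons = torch.full((4, 3), 1000.0)   # emitter brightness
img = decoder(xy, z, photons, torch.randn(4, 16))
print(img.shape)  # torch.Size([4, 1, 32, 32])
```

The split matters for the stated goal: because the foreground path is a fixed, interpretable physics model, every generated image comes with exact emitter coordinates for free, while the learned path absorbs the background and noise statistics of the experimental data.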