Enabling Real-Time Fluctuation-Based Super Resolution Imaging

Abstract

Live-cell imaging captures dynamic cellular processes, but many structures remain beyond the diffraction limit. Fluctuation-based super-resolution techniques overcome this limit by exploiting correlations in fluorescence blinking. However, existing methods require the acquisition of hundreds of frames and involve computationally intensive post-processing that can take tens of seconds, limiting their suitability for real-time sub-diffraction imaging of fast cellular events. To address this, we use a recurrent neural network that integrates sequential low-resolution frames to extract spatio-temporally correlated signals. By creating a synthetic dataset of blinking emitters and using super-resolution optical fluctuation imaging (SOFI) to reconstruct the super-resolution targets, we developed a deep-learning-based real-time super-resolution fluctuation imaging method (RESURF). Our method significantly improves temporal resolution by reducing the required number of frames to only 8, while also doubling the spatial resolution. Testing on different datasets shows that our method offers the flexibility of training with either synthetic or experimental data and generalizes to different structures. We also demonstrate a 400-fold reduction in computational latency compared to SOFI, achieving inference times of less than 30 ms, highlighting its promise as both an efficient high-throughput imaging tool and a real-time solution for live-cell super-resolution imaging.
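
To make the core idea concrete, below is a minimal PyTorch sketch of the kind of recurrent model the abstract describes: a ConvLSTM-style network that sequentially integrates a short stack of diffraction-limited frames (e.g. 8) and predicts a single 2x-upsampled image. This is not the authors' implementation; the architecture, layer sizes, upscale factor, and all names (FluctuationSRNet, ConvLSTMCell) are illustrative assumptions, and in practice the network would be trained against SOFI reconstructions as targets.

```python
# Illustrative sketch only: a recurrent network that accumulates
# spatio-temporal fluctuation information across frames and outputs
# a 2x super-resolution estimate. Layer sizes are arbitrary.
import torch
import torch.nn as nn

class ConvLSTMCell(nn.Module):
    def __init__(self, in_ch, hid_ch):
        super().__init__()
        # All four gates computed jointly from the current frame and hidden state.
        self.gates = nn.Conv2d(in_ch + hid_ch, 4 * hid_ch, kernel_size=3, padding=1)

    def forward(self, x, state):
        h, c = state
        i, f, o, g = torch.chunk(self.gates(torch.cat([x, h], dim=1)), 4, dim=1)
        c = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
        h = torch.sigmoid(o) * torch.tanh(c)
        return h, c

class FluctuationSRNet(nn.Module):
    def __init__(self, hid_ch=32, upscale=2):
        super().__init__()
        self.hid_ch = hid_ch
        self.cell = ConvLSTMCell(1, hid_ch)
        # Pixel shuffle doubles the lateral sampling of the accumulated features.
        self.head = nn.Sequential(
            nn.Conv2d(hid_ch, upscale ** 2, kernel_size=3, padding=1),
            nn.PixelShuffle(upscale),
        )

    def forward(self, frames):  # frames: (batch, time, 1, H, W), e.g. time = 8
        b, t, _, hgt, wid = frames.shape
        h = frames.new_zeros(b, self.hid_ch, hgt, wid)
        c = frames.new_zeros(b, self.hid_ch, hgt, wid)
        for k in range(t):  # sequential integration of low-resolution frames
            h, c = self.cell(frames[:, k], (h, c))
        return self.head(h)  # (batch, 1, 2H, 2W) super-resolution estimate

# Example: 8 raw 64x64 frames in, one 128x128 reconstruction out.
net = FluctuationSRNet()
out = net(torch.rand(1, 8, 1, 64, 64))
print(out.shape)  # torch.Size([1, 1, 128, 128])
```

The recurrence is what allows inference from only a handful of frames: each incoming frame updates a hidden state that summarizes the blinking statistics seen so far, so no large frame buffer or per-stack cumulant computation is needed at run time.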
