One Image, All Aberrations: A Universal Framework for Wavefront Correction

Abstract

Optical aberrations have been studied for decades owing to their central role in applications ranging from microscopy and astronomy to quantum imaging, and they are equally critical to technologies woven into daily life, from smartphone cameras and eyeglass lenses to autonomous vehicles and optical networks. Despite sustained advances in wavefront shaping, aberrations continue to impose fundamental limits on imaging performance, and all existing correction strategies are constrained by inherent trade-offs among accuracy, speed, generality, and physical scalability. Detecting and correcting aberrations typically requires complicated platforms that are bulky, restricted to narrowband operation, challenging to fabricate, and poorly compatible with broadband, non-paraxial, or structured fields. Here, we report a unified, single-shot wavefront correction framework that departs fundamentally from conventional approaches. Leveraging a hybrid deep learning architecture, our method retrieves all phase distortions directly from a single focal-plane intensity image, circumventing the need for defocused inputs and iterative procedures. The framework is intrinsically broadband and adaptable, and it achieves diffraction-limited performance across a wide range of structured light fields, including orbital angular momentum (OAM), Hermite-Gaussian (HG), and Laguerre-Gaussian (LG) modes. This approach not only collapses the wavefront sensing pipeline into a single computational step but also establishes a scalable foundation for real-time aberration correction in next-generation optical systems.
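The sensing problem described above can be made concrete with a toy forward model: an aberration adds a phase term to the pupil field, and the focal-plane intensity, which serves as the single input image in a framework like the one reported here, is the squared magnitude of the pupil field's Fourier transform. The sketch below is illustrative only, assuming Fraunhofer propagation and a single defocus Zernike term; it is not the authors' architecture, and all names and values (grid size, aberration strength) are hypothetical.

```python
import numpy as np

def defocus_mode(n):
    """Defocus-like Zernike term (2r^2 - 1) sampled on an n x n unit-disk pupil.

    Returns the phase mode and the binary pupil aperture mask.
    (Illustrative: a real wavefront sensor would use a full Zernike basis.)
    """
    y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
    r2 = x**2 + y**2
    mask = r2 <= 1.0
    return np.where(mask, 2.0 * r2 - 1.0, 0.0), mask

def focal_intensity(phase, pupil):
    """Fraunhofer model: focal-plane intensity is |FFT of the pupil field|^2."""
    field = pupil * np.exp(1j * phase)
    return np.abs(np.fft.fftshift(np.fft.fft2(field))) ** 2

n = 128
mode, pupil = defocus_mode(n)
coeff = 0.8  # hypothetical defocus strength in radians

img_aberrated = focal_intensity(coeff * mode, pupil)
img_ideal = focal_intensity(0.0 * mode, pupil)

# Aberration spreads energy away from the diffraction-limited peak,
# so the Strehl ratio (peak intensity relative to the ideal) drops below 1.
strehl = img_aberrated.max() / img_ideal.max()
print(f"Strehl ratio: {strehl:.3f}")
```

In this picture, a learned model would invert `focal_intensity`: given `img_aberrated`, it would estimate `coeff` (or a full set of Zernike coefficients) so the conjugate phase can be applied for correction, which is the step that iterative phase-retrieval methods normally perform and that a single-shot network replaces.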
