Meta-learning physics-informed neural networks for few-shot parameter inference

Abstract

Physics-informed neural networks require extensive retraining for each new problem, limiting their applicability in few-shot scenarios. We develop a meta-learning framework that enables rapid adaptation to new physics problems with minimal data while maintaining physical consistency. The framework combines gradient-based meta-learning with physics-informed constraints, incorporating adaptive constraint weighting and automated physics discovery. It optimizes for parameters that enable fast adaptation through gradient descent while enforcing partial differential equation constraints in both the inner and outer optimization loops. Experimental validation on fluid dynamics problems demonstrates 92.4% validation accuracy (standard deviation = 4.2%, 95% confidence interval [88.2%, 96.6%]) compared to 83.0% for transfer-learning baselines (p < 0.001, Cohen's d = 2.1). Our method requires three times fewer adaptation steps (50 vs. 150) and achieves a 15% improvement in generalization performance. Automated physics discovery identifies causal relationships with 94% accuracy across Navier-Stokes, heat-transfer, and Burgers equation problems. Meta-learning for physics-informed neural networks thus enables few-shot learning for computational fluid dynamics with theoretical convergence guarantees and practical efficiency gains, advancing physics-informed machine learning for resource-constrained applications. GitHub: https://github.com/YCRG-Labs/meta-pinn
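
The released implementation is at the GitHub link above. As a rough, illustrative sketch of the inner/outer structure described in the abstract (not the authors' code), the snippet below applies first-order MAML-style meta-learning to a PINN on a toy family of 1D Poisson problems -u''(x) = k sin(pi x), used here as a stand-in for the paper's fluid-dynamics tasks. The function names, the fixed constraint weight `lam`, and all hyperparameters are assumptions, and the adaptive constraint weighting and physics-discovery components are omitted; the point is only that the PDE residual term appears in both the inner adaptation loss and the outer meta-loss.

```python
# Minimal sketch (assumed setup, not the authors' released code):
# first-order MAML-style meta-learning of a PINN on a toy parametric
# 1D Poisson family  -u''(x) = k*sin(pi*x),  u(x) = k/pi^2 * sin(pi*x).
import copy

import torch
import torch.nn as nn

torch.manual_seed(0)

def make_net():
    # Small fully connected PINN: x -> u(x)
    return nn.Sequential(nn.Linear(1, 32), nn.Tanh(),
                         nn.Linear(32, 32), nn.Tanh(),
                         nn.Linear(32, 1))

def pde_residual(net, x, k):
    # Residual of the toy PDE  -u''(x) - k*sin(pi*x) = 0, computed with autograd.
    x = x.clone().requires_grad_(True)
    u = net(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    return -d2u - k * torch.sin(torch.pi * x)

def task_loss(net, x_data, u_data, x_col, k, lam=1.0):
    # Few-shot data misfit + physics (PDE residual) penalty; lam is a fixed
    # constraint weight here (the paper describes adaptive weighting).
    data = ((net(x_data) - u_data) ** 2).mean()
    phys = (pde_residual(net, x_col, k) ** 2).mean()
    return data + lam * phys

def sample_shot(k, n_data=5, n_col=64):
    # A "shot" for task k: a handful of labelled points plus collocation points.
    x_data = torch.rand(n_data, 1)
    u_data = k / torch.pi ** 2 * torch.sin(torch.pi * x_data)  # exact solution
    x_col = torch.rand(n_col, 1)
    return x_data, u_data, x_col

meta_net = make_net()
meta_opt = torch.optim.Adam(meta_net.parameters(), lr=1e-3)
inner_lr, inner_steps, meta_batch = 1e-2, 5, 4

for it in range(200):                                   # meta-training iterations
    meta_opt.zero_grad()
    for _ in range(meta_batch):
        k = torch.rand(1) * 4.0 + 1.0                   # a task = one source amplitude k
        adapted = copy.deepcopy(meta_net)               # inner loop starts from meta-parameters
        inner_opt = torch.optim.SGD(adapted.parameters(), lr=inner_lr)
        for _ in range(inner_steps):                    # inner loop: adapt with data + PDE loss
            inner_opt.zero_grad()
            task_loss(adapted, *sample_shot(k), k).backward()
            inner_opt.step()
        outer = task_loss(adapted, *sample_shot(k), k)  # outer loss also enforces the PDE
        grads = torch.autograd.grad(outer, adapted.parameters())
        for p, g in zip(meta_net.parameters(), grads):  # first-order meta-gradient
            p.grad = g / meta_batch if p.grad is None else p.grad + g / meta_batch
    meta_opt.step()
```

At meta-test time, a new task (a new k in this toy setup) would be handled by running only the few inner adaptation steps from `meta_net`, mirroring the few-shot adaptation regime the abstract reports.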