Voronovskaya Expansions and Spectral Convergence for Neural Operators on Complex Foliated Manifolds
Abstract
This paper introduces a rigorous geometric-analytic framework for symmetrized neural operators defined on holomorphic foliations in complex manifolds. By integrating tools from complex geometry, functional analysis, and neural operator theory, we establish a novel class of operators that respect the intrinsic symmetries imposed by the foliation's holonomy pseudogroup. Our analysis proves universal approximation theorems in leafwise Sobolev and \( C^k \) spaces, demonstrating that symmetrized neural operators can approximate arbitrary smooth functions on both compact and non-compact leaves with explicit convergence rates. A key result is the derivation of Voronovskaya-type asymptotic expansions, which reveal a deep connection between these operators and the leafwise Laplace--Beltrami operator, providing precise leading-order behavior and remainder estimates. We further establish \( L^p \)-stability and spectral decomposition results, linking the dynamics of symmetrized neural operators to the underlying foliation geometry. The framework extends to singular foliations and non-compact leaves, introducing weighted Sobolev spaces and spectral convergence results near singularities. These operators are interpreted as discrete approximations to leafwise diffusion processes, bridging operator-theoretic neural networks with classical complex dynamical systems. The results open new directions in high-dimensional approximation theory, spectral analysis, and geometrically constrained machine learning, with applications to Calabi--Yau manifolds and logarithmic foliations.
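For orientation, a Voronovskaya-type expansion of the kind referenced above typically takes the following schematic form; this is a hedged sketch modeled on the classical Voronovskaya theorem for positive linear operators, not the paper's exact statement, and the operator symbol \( \mathcal{N}_n \), constant \( c \), and remainder \( R_n \) are illustrative notation:

```latex
% Schematic Voronovskaya-type expansion (illustrative notation):
% N_n  -- a symmetrized neural operator at discretization level n (assumed symbol)
% Delta_F -- the leafwise Laplace--Beltrami operator of the foliation
\[
  \mathcal{N}_n f(x) \;=\; f(x) \;+\; \frac{c}{n}\,\Delta_{\mathcal{F}} f(x) \;+\; R_n(x),
  \qquad \|R_n\|_{L^p} = o(n^{-1}) \ \text{as } n \to \infty,
\]
```

In the classical case of Bernstein operators on \([0,1]\), the analogous statement reads \( \lim_{n\to\infty} n\,(B_n f(x) - f(x)) = \tfrac{x(1-x)}{2} f''(x) \), which illustrates how such expansions identify a second-order differential operator as the leading-order correction.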