Adaptive Voronovskaya-Type Expansions and Sobolev-Santos Uniform Convergence for Symmetrized Hyperbolic Tangent Neural Networks


Abstract

This work introduces a novel class of multivariate neural network operators activated by symmetrized and perturbed hyperbolic tangent functions, with a focus on the \textbf{Sobolev-Santos Uniform Convergence Theorem}. Operators of basic, Kantorovich, and quadrature types are analyzed through Voronovskaya-type asymptotic expansions, yielding rigorous convergence rates for approximating continuous functions and their derivatives in the Sobolev spaces \(W^{s,p}(\mathbb{R}^N)\). The proposed symmetrization enhances both approximation power and regularity, enabling precise asymptotic descriptions as the network size grows. The study establishes uniform convergence rates in \(L^p\) and Sobolev norms, explicitly quantifying the influence of smoothness, dimensionality, and grid parameters. The \textbf{Sobolev-Santos Theorem} ensures uniform stability of these expansions under parametric variations of the activation function, guaranteeing robustness across different configurations. The results highlight the strong performance of these operators in high-dimensional approximation problems, with implications for artificial intelligence, data analytics, and numerical analysis. The explicit constants and uniform bounds provided offer a solid foundation for both theoretical and applied research in neural network-based function approximation.
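The abstract does not reproduce the definitions it refers to. As a minimal sketch, assuming a perturbed hyperbolic tangent of the form common in this literature (the parameters \(q, \lambda\) and the density \(\Phi\) below are illustrative assumptions, not quoted from the article), one may take

\[
g_{q,\lambda}(x) \;=\; \frac{e^{\lambda x} - q\, e^{-\lambda x}}{e^{\lambda x} + q\, e^{-\lambda x}},
\qquad q, \lambda > 0,
\]

with a symmetrization obtained by averaging over \(q\) and \(1/q\),

\[
\overline{g}_{q,\lambda}(x) \;=\; \tfrac{1}{2}\bigl( g_{q,\lambda}(x) + g_{1/q,\lambda}(x) \bigr),
\]

which is odd in \(x\) (since \(g_{q,\lambda}(-x) = -g_{1/q,\lambda}(x)\)), so that the induced density

\[
\Phi(x) \;=\; \tfrac{1}{4}\bigl( \overline{g}_{q,\lambda}(x+1) - \overline{g}_{q,\lambda}(x-1) \bigr)
\]

is even. Densities of this kind are the usual building blocks of the basic, Kantorovich, and quadrature operator families named above. A Voronovskaya-type expansion then isolates the leading error terms: schematically, for a univariate operator \(A_n\) built on such a density and \(f \in C^r\),

\[
A_n(f)(x) - f(x) \;=\; \sum_{j=1}^{r} \frac{f^{(j)}(x)}{j!}\, A_n\!\bigl((\cdot - x)^j\bigr)(x) \;+\; o\!\bigl(n^{-\beta r}\bigr),
\qquad 0 < \beta < 1,
\]

with the article's stated contribution being explicit constants and rates in the multivariate Sobolev setting, uniformly in the activation parameters.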
