Algorithms for Solving Ordinary Differential Equations Based on Orthogonal Polynomial Neural Networks


Abstract

This article proposes single-layer neural network algorithms for solving second-order ordinary differential equations based on the functional link principle. Under this principle, the hidden layer of the neural network is replaced by a functional expansion block that enhances the input patterns with orthogonal Chebyshev, Legendre, and Laguerre polynomials. The polynomial neural network algorithms were implemented in Python using the PyCharm environment. Their performance was tested by solving initial and boundary value problems for the nonlinear Lane–Emden equation. The results are compared with the exact solutions of the problems considered, as well as with the solution obtained using a multilayer perceptron. It is shown that polynomial neural networks can perform more efficiently than multilayer neural networks. Furthermore, a neural network based on Laguerre polynomials can, in some cases, be more accurate and faster than neural networks based on Legendre and Chebyshev polynomials. Overfitting of polynomial neural networks and strategies for overcoming it are also discussed.
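To illustrate the idea described in the abstract, the following is a minimal sketch (not the authors' code) of a Chebyshev functional-expansion network applied to the Lane–Emden equation of index m = 5, y'' + (2/x) y' + y^5 = 0 with y(0) = 1, y'(0) = 0, whose exact solution is y(x) = (1 + x^2/3)^(-1/2). The trial-solution form, the collocation domain, and the use of scipy's least_squares as a stand-in for the training procedure are assumptions made for illustration only.

import numpy as np
from numpy.polynomial import chebyshev as C
from scipy.optimize import least_squares

m = 5                           # polytropic index of the Lane-Emden equation (assumed example)
K = 8                           # number of Chebyshev basis functions in the expansion
x = np.linspace(0.05, 1.0, 40)  # collocation points; x > 0 avoids the 2/x singularity

def residual(w):
    # Functional expansion N(x) = sum_k w_k T_k(x) and its derivatives,
    # obtained via Chebyshev coefficient calculus (no hidden layer is trained).
    N   = C.chebval(x, w)
    dN  = C.chebval(x, C.chebder(w, 1))
    d2N = C.chebval(x, C.chebder(w, 2))
    # Trial solution constructed to satisfy y(0) = 1 and y'(0) = 0 exactly.
    y   = 1.0 + x**2 * N
    dy  = 2.0 * x * N + x**2 * dN
    d2y = 2.0 * N + 4.0 * x * dN + x**2 * d2N
    # Lane-Emden residual at the collocation points; training drives it to zero.
    return d2y + (2.0 / x) * dy + y**m

# Fit the output weights only (stand-in for the paper's training algorithm).
sol = least_squares(residual, np.zeros(K))

y_net   = 1.0 + x**2 * C.chebval(x, sol.x)
y_exact = (1.0 + x**2 / 3.0) ** -0.5
print("max abs error:", np.max(np.abs(y_net - y_exact)))

Legendre or Laguerre variants would follow the same pattern, swapping numpy.polynomial.chebyshev for numpy.polynomial.legendre or numpy.polynomial.laguerre in the expansion block.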
