Performance Analysis of Backpropagation Artificial Neural Networks with Various Activation Functions and Network Sizes
Abstract
This paper conducts a comprehensive performance analysis of Backpropagation Artificial Neural Networks (BP-ANNs) using various activation functions. Activation functions play a crucial role in shaping a neural network's behavior and learning capability. Through systematic evaluation across diverse network sizes (numbers of hidden layers and neurons), this study assesses the impact of commonly employed activation functions—such as Sigmoidal, Tanh, Cloglog, Aranda, and others—on the convergence speed and accuracy of BP-ANNs. The findings provide empirical insights for optimizing neural network architectures tailored to specific applications and datasets.
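For readers unfamiliar with the less common activation functions named above, the following sketch gives one plausible form of each. The Cloglog and Aranda definitions are assumptions drawn from the neural-network literature (the complementary log-log link and the Aranda-Ordaz transformation with shape parameter 2); the paper may use slightly different parameterizations.

```python
import math

# Candidate activation functions compared in the study.
# Cloglog and Aranda forms are assumed variants, not taken from the paper.

def sigmoid(x: float) -> float:
    # Logistic sigmoid: maps R -> (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x: float) -> float:
    # Hyperbolic tangent: maps R -> (-1, 1)
    return math.tanh(x)

def cloglog(x: float) -> float:
    # Complementary log-log: an asymmetric sigmoid-like curve
    return 1.0 - math.exp(-math.exp(x))

def aranda(x: float) -> float:
    # Aranda-Ordaz transformation with lambda = 2 (assumed variant)
    return 1.0 - (1.0 + 2.0 * math.exp(x)) ** -0.5
```

All four map inputs to a bounded range, but they differ in symmetry about zero and in gradient magnitude, which is what drives the differences in convergence speed the study measures.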