Secant Optimization Algorithm for Effective Global Optimization

Abstract

This paper presents the Secant Optimization Algorithm (SOA), a novel mathematics-inspired metaheuristic derived from the Secant Method. SOA improves search efficiency by iteratively updating solution vectors with local information and derivative approximations in two steps: secant-based updates that enable guided convergence, and stochastic sampling with an expansion factor that enables global search and escape from local optima. The algorithm's performance was verified on a set of benchmark functions spanning low- to high-dimensional nonlinear optimization problems, including the CEC2020 and CEC2021 test suites. In addition, SOA was applied to real-world problems: hyperparameter tuning of convolutional neural networks on four datasets (MNIST, MNIST-RD, Convex, and Rectangle-I) and parameter estimation of photovoltaic (PV) systems. Comparative analyses against leading algorithms in the field confirm SOA's competitive performance, with fast convergence and high solution accuracy. Moreover, statistical tests and convergence trajectories confirm SOA's robustness and flexibility, making it an effective tool for challenging machine learning and engineering optimization problems. The source code is available at https://github.com/MohammedQaraad/SOA.
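For context, the classical Secant Method from which SOA is derived approximates the derivative by a finite difference between two successive iterates. The sketch below shows only this textbook root-finding update, not SOA's own metaheuristic update rules (which the abstract does not specify); the function names and tolerances are illustrative assumptions.

```python
def secant(f, x0, x1, tol=1e-10, max_iter=100):
    """Classical secant method: iterate
    x_{n+1} = x_n - f(x_n) * (x_n - x_{n-1}) / (f(x_n) - f(x_{n-1})),
    i.e. Newton's method with the derivative replaced by a
    finite-difference (secant) approximation."""
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        if f1 == f0:  # flat secant: cannot update further
            break
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        if abs(x2 - x1) < tol:
            return x2
        x0, x1 = x1, x2
    return x1

# Example: root of f(x) = x^2 - 2 starting from the bracket [1, 2]
root = secant(lambda x: x * x - 2, 1.0, 2.0)  # converges to sqrt(2)
```

SOA reuses this derivative-free update idea for guided convergence, then layers stochastic sampling with an expansion factor on top to escape local optima.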
