Enhanced Antenna Design through Hyperparameter Optimization of Diverse Machine Learning Models Using Bayesian Optimization

Abstract

This work investigates the use of machine learning (ML) models, combined with Bayesian optimization, for microstrip patch antenna design. Using datasets produced by CST Microwave Studio 2023 simulations, key antenna parameters, including resonance frequency, bandwidth, and return loss, were first predicted using Support Vector Regressor (SVR), k-Nearest Neighbor (KNN), and Gradient Boosting Regressor (GBR) models. With slot distance, patch length, and patch width as target parameters, the CST output was pre-processed into structured input-output pairs to prepare the dataset for ML training. Extending this initial approach, we evaluated ten ML models, each optimized with Bayesian hyperparameter tuning: SVR, KNN, GBR, Random Forest, XGBoost, Decision Tree, Stochastic Gradient Descent, Artificial Neural Network, Gaussian Process Regressor, and Linear Regression. By fine-tuning parameters such as max_depth and n_estimators, Bayesian optimization substantially improved the more complex models, lowering Mean Squared Error (MSE) and Root Mean Squared Error (RMSE) while raising R2 scores. Comparative results show that, after optimization, the Random Forest and XGBoost models achieved the best predictive accuracy. To further enable real-time model training, testing, and performance visualization, a dedicated graphical user interface (GUI) was developed, offering a practical tool for interactive antenna optimization. This system demonstrates how ML models combined with Bayesian tuning can effectively address challenging antenna design problems, and it provides a solid basis for data-driven improvements in advanced engineering applications.
