Machine Learning Based Model For Design and Optimization of Rectangular Patch Antenna at 3.5 GHz for 5G Applications

Abstract

This paper presents the design and simulation of a low-profile rectangular patch antenna for 5G applications at 3.5 GHz. The substrate selected is Rogers RT/Duroid 5880, which has a relative permittivity of 2.2. The proposed antenna layout is simulated in Computer Simulation Technology (CST) Microwave Studio Suite. The simulation results show that S11, gain, directivity, efficiency, and bandwidth vary significantly with the substrate's relative permittivity and thickness. The designed antenna achieves a return loss of -43.85 dB, a VSWR of 1.015, a gain of 8.90 dBi, a directivity of 10.5 dBi, a bandwidth of 177 MHz, and an efficiency of 85.7%. With a gain exceeding 5 dBi, the antenna is well suited for communication applications. The paper also develops a mathematical model based on multivariate polynomial regression to predict three antenna parameters: return loss, VSWR, and bandwidth. In addition, a machine learning model built on a feed-forward back-propagation neural network trained with the Levenberg-Marquardt algorithm is developed and applied to patch antenna design. It takes the dielectric constant (εr), substrate thickness (hs), and dominant-mode resonant frequency (fr) as inputs and predicts S11, VSWR, and bandwidth. The model's performance is evaluated against simulated values obtained from CST Studio Suite, and the machine learning predictions closely match the simulation results. The neural-network-based estimation also offers the advantage of rapid, simultaneous computation of all outputs.
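
For readers who want to reproduce the starting point of such a design, the sketch below applies the standard transmission-line model (Balanis) to size a rectangular patch at 3.5 GHz on an εr = 2.2 substrate. The substrate height of 1.575 mm is an assumed value (a common RT/Duroid 5880 laminate thickness); the abstract does not state the thickness used in the paper, and the feed design and any optimization steps are omitted.

```python
"""Minimal sketch: transmission-line-model sizing of a rectangular patch
at 3.5 GHz on RT/Duroid 5880 (eps_r = 2.2). The substrate height h is an
assumed value, not taken from the paper."""
import math

c = 3e8            # speed of light (m/s)
f_r = 3.5e9        # target resonant frequency (Hz)
eps_r = 2.2        # relative permittivity of RT/Duroid 5880
h = 1.575e-3       # assumed substrate thickness (m)

# Patch width for efficient radiation.
W = c / (2 * f_r) * math.sqrt(2 / (eps_r + 1))

# Effective permittivity accounting for fringing fields.
eps_eff = (eps_r + 1) / 2 + (eps_r - 1) / 2 * (1 + 12 * h / W) ** -0.5

# Length extension due to fringing, then the physical patch length.
dL = 0.412 * h * ((eps_eff + 0.3) * (W / h + 0.264)) / ((eps_eff - 0.258) * (W / h + 0.8))
L = c / (2 * f_r * math.sqrt(eps_eff)) - 2 * dL

print(f"W = {W*1e3:.2f} mm, L = {L*1e3:.2f} mm, eps_eff = {eps_eff:.3f}")
```

With the assumed 1.575 mm substrate this yields a patch of roughly 34 mm x 28 mm, which is the kind of initial geometry that would then be refined in CST before the reported performance figures are obtained.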
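The abstract also describes two surrogate models that map (εr, hs, fr) to S11, VSWR, and bandwidth: a multivariate polynomial model and a feed-forward network trained with Levenberg-Marquardt. The sketch below illustrates one way to set up both under stated assumptions: the training arrays are synthetic placeholders standing in for CST simulation sweeps, the polynomial degree (2), the network size (3-4-3 with a tanh hidden layer), and the use of SciPy's Levenberg-Marquardt least-squares solver are all choices made here for illustration, not details taken from the paper.

```python
"""Sketch of the two surrogate models described in the abstract, under the
assumptions noted above. X and Y are placeholder arrays; in practice they
would be replaced by design points and results exported from CST."""
import numpy as np
from scipy.optimize import least_squares
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)

# Placeholder inputs: eps_r, substrate thickness h_s (mm), resonant frequency f_r (GHz).
X = np.column_stack([
    rng.uniform(2.0, 4.5, 40),
    rng.uniform(0.8, 1.6, 40),
    rng.uniform(3.3, 3.7, 40),
])
# Placeholder targets: S11 (dB), VSWR, bandwidth (MHz) -- illustrative only.
Y = np.column_stack([
    -40.0 + 5.0 * rng.standard_normal(40),
    1.0 + 0.2 * rng.random(40),
    150.0 + 40.0 * rng.random(40),
])

# 1) Multivariate polynomial regression (degree-2 terms in all three inputs).
poly = make_pipeline(PolynomialFeatures(degree=2, include_bias=False), LinearRegression())
poly.fit(X, Y)
print("polynomial prediction:", poly.predict([[2.2, 1.575, 3.5]]))

# 2) Feed-forward network (3-4-3, tanh hidden layer) trained with Levenberg-Marquardt.
Xs, Ys = (X - X.mean(0)) / X.std(0), (Y - Y.mean(0)) / Y.std(0)   # standardise in/out
n_in, n_hid, n_out = 3, 4, 3
n_w = n_in * n_hid + n_hid + n_hid * n_out + n_out                # 31 weights and biases

def forward(w, Xn):
    W1 = w[:n_in * n_hid].reshape(n_in, n_hid)
    b1 = w[n_in * n_hid:n_in * n_hid + n_hid]
    W2 = w[n_in * n_hid + n_hid:n_in * n_hid + n_hid + n_hid * n_out].reshape(n_hid, n_out)
    b2 = w[-n_out:]
    return np.tanh(Xn @ W1 + b1) @ W2 + b2

def residuals(w):
    return (forward(w, Xs) - Ys).ravel()      # flattened prediction error over all samples

fit = least_squares(residuals, 0.1 * rng.standard_normal(n_w), method="lm")  # Levenberg-Marquardt
query = ([2.2, 1.575, 3.5] - X.mean(0)) / X.std(0)
print("network prediction:  ", forward(fit.x, query) * Y.std(0) + Y.mean(0))
```

Once trained on real simulation data, either surrogate returns all three outputs from a single evaluation, which is the rapid, simultaneous estimation advantage the abstract highlights over rerunning full-wave simulations.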
