Optimizing the Figure of Merit for DGTFET Ferroelectric Devices Using a Machine Learning-Based Genetic Algorithm

Abstract

In this research, we conducted an in-depth analysis of the application of ferroelectric tunnel FETs (FeTFETs) to emerging complex neural networks. We explored the use of Artificial Neural Networks (ANNs) to optimize the OFF-state current (I_OFF) in a dual-gate ferroelectric tunnel transistor (FeDGTFET) structure incorporating innovative materials such as ferroelectric BaTiO3 and hafnium dioxide (HfO2) as a high-permittivity gate oxide. The study considered specific features of the FeDGTFET structure, including doping and permittivity, while examining the complex interactions between synapses, weights, and dendrites within this configuration. By applying the backpropagation algorithm, which is based on gradient-descent principles, we aimed to minimize model error and adjust the structure parameters for improved accuracy. We then used fitting techniques to align the model with experimental data, taking into account the unique properties of the high-permittivity oxides. Finally, using a genetic algorithm (GA), we optimized the model to predict the I_OFF current with enhanced accuracy, assessing performance through metrics such as the Root Mean Squared Error (RMSE) and the coefficient of determination (R²). The results of this study demonstrate that the GA-optimized neural network model shows promising potential for predicting the I_OFF current in tunnel FETs based on BaTiO3 ferroelectrics and high-permittivity oxides. The training database was built through a communication interface between TCAD-SILVACO and MATLAB.
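The GA-optimized neural-network workflow described above can be sketched in miniature: a genetic algorithm evolves the parameters of a small network to fit a target curve, and the fit is scored with RMSE and R². This is an illustrative sketch only; the synthetic data, the tiny two-neuron network, and all GA settings (population size, elitism fraction, mutation rate) are assumptions for demonstration, not the paper's actual TCAD-derived dataset or model.

```python
import math
import random

random.seed(0)

# Synthetic stand-in data: a single device feature (e.g. normalized gate
# permittivity) vs. a normalized log-scale I_OFF target. Illustrative only;
# the paper's database comes from a TCAD-SILVACO / MATLAB interface.
X = [[k / 10.0] for k in range(1, 21)]
y = [0.8 * x[0] - 0.05 * x[0] ** 2 for x in X]

def predict(params, x):
    """Tiny one-hidden-layer network: two tanh units, one linear output."""
    w1a, b1a, w1b, b1b, w2a, w2b, b2 = params
    h1 = math.tanh(w1a * x[0] + b1a)
    h2 = math.tanh(w1b * x[0] + b1b)
    return w2a * h1 + w2b * h2 + b2

def rmse(params):
    err = [(predict(params, x) - t) ** 2 for x, t in zip(X, y)]
    return math.sqrt(sum(err) / len(err))

def r_squared(params):
    mean_y = sum(y) / len(y)
    ss_res = sum((predict(params, x) - t) ** 2 for x, t in zip(X, y))
    ss_tot = sum((t - mean_y) ** 2 for t in y)
    return 1.0 - ss_res / ss_tot

def evolve(pop_size=60, generations=120, n_params=7):
    """GA with elitism, uniform crossover, and Gaussian mutation."""
    pop = [[random.uniform(-5, 5) for _ in range(n_params)]
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=rmse)
        elite = scored[: pop_size // 5]           # keep the best fifth
        children = list(elite)
        while len(children) < pop_size:
            a, b = random.sample(elite, 2)        # crossover between elites
            child = [random.choice(pair) for pair in zip(a, b)]
            if random.random() < 0.3:             # Gaussian mutation
                i = random.randrange(n_params)
                child[i] += random.gauss(0, 0.3)
            children.append(child)
        pop = children
    return min(pop, key=rmse)

best = evolve()
print(f"RMSE = {rmse(best):.4f}, R^2 = {r_squared(best):.4f}")
```

In the paper the GA plays the same role shown here, searching the model's parameter space for the configuration that minimizes RMSE on the I_OFF prediction, with R² reported as a goodness-of-fit check.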
