Self-Consistent Recurrent Neural Network for Path-Dependent Deformation
Abstract
Using data-driven machine learning (ML) models as surrogates in classical engineering is an emerging trend in the literature. However, effective surrogate modeling in path-dependent problems requires a deep understanding of the fundamental physical properties that naturally arise in data obtained from simulations or experiments. While generic ML architectures can capture nonlinear behavior, they may not inherently satisfy the specific temporal constraints dictated by physical processes. This study examines the characteristics of deformation paths generated through finite element simulations and identifies key modeling requirements for achieving physically meaningful predictions. One important requirement is that future inputs do not influence past outputs, a property typically satisfied by most surrogate ML models, yet rarely acknowledged or formalized. This requirement, often called the truncation condition, is essential for achieving physically meaningful predictions. Another closely related requirement is consistency across different time discretizations, which remains an active and important topic in deformation-history modeling. To address these requirements, we propose a customized and adaptable Recurrent Neural Network (RNN) transition function that takes absolute strain inputs and is designed to enforce both truncation and consistency, ensuring robust predictions across varying temporal resolutions. This study provides a foundational step toward physically consistent damage initiation estimation and supports the development of more reliable surrogate models in computational mechanics.
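The truncation condition mentioned above can be illustrated with a minimal sketch: a vanilla RNN cell rolled forward over absolute strain inputs is causal by construction, so outputs at a given step are unaffected by any later inputs. The cell below is a generic tanh RNN with randomly drawn weights and a hypothetical hidden size, not the customized transition function proposed in the paper; it only demonstrates the property, not the architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
H, D = 4, 1  # hypothetical hidden-state and strain dimensions (illustrative only)
Wh = rng.normal(scale=0.3, size=(H, H))  # hidden-to-hidden weights
Wx = rng.normal(scale=0.3, size=(H, D))  # strain-to-hidden weights
Wo = rng.normal(scale=0.3, size=(1, H))  # hidden-to-output readout

def rnn_outputs(strain_path):
    """Roll a vanilla RNN transition over absolute strain inputs, one output per step."""
    h = np.zeros(H)
    outputs = []
    for eps in strain_path:
        h = np.tanh(Wh @ h + Wx @ np.atleast_1d(eps))
        outputs.append(float(Wo @ h))
    return np.array(outputs)

# Two strain histories that coincide for the first 5 steps, then diverge.
path_a = np.linspace(0.0, 0.02, 10)
path_b = path_a.copy()
path_b[5:] *= -1.0  # hypothetical future unloading branch

y_a, y_b = rnn_outputs(path_a), rnn_outputs(path_b)
# Truncation: past outputs are identical because future inputs cannot propagate backward.
print(np.allclose(y_a[:5], y_b[:5]))
```

Consistency across time discretizations is the harder requirement: unlike causality, it does not hold automatically for a generic cell of this form, which motivates the customized transition function proposed in the study.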