From Deterministic to Machine Learning: A New Approach for Estimating the Atmospheric Turbulence

Abstract

This study proposes a machine-learning (ML) replacement for the Holtslag–Boville (1993) turbulence parameterization in the BAM-1D model. An optimal multi-layer perceptron neural network was trained on data from the GoAmazon experiment to reproduce eddy diffusivities and counter-gradient terms, achieving excellent agreement with the reference scheme: correlations of 0.9996 for the momentum eddy diffusivity (kvm) and 0.9968 for the scalar eddy diffusivity (kvh). The ML-based turbulence driver preserved the dynamic behavior of temperature, wind components, and precipitation, showing no artificial signals or instability. Computational analysis based on 87,120 samples shows that the neural-network routine is only ~0.09 ms (≈3–4%) slower per call than the Holtslag–Boville scheme (2.51 ms vs. 2.42 ms), implying an expected impact below 2% on total runtime in a full BAM-3D configuration, the operational version at Brazil's National Institute for Space Research (INPE). Overall, the results demonstrate that data-driven turbulence parameterizations can reproduce traditional closures with high accuracy and stability while remaining computationally competitive for atmospheric modeling.
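The regression task described above — a multi-layer perceptron mapping boundary-layer predictors to eddy diffusivities — can be illustrated with a minimal NumPy sketch. Everything here is an assumption for illustration: the predictors (normalized height, shear, a stability parameter), the synthetic target profiles, and the network size are stand-ins, not the actual GoAmazon-trained configuration from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical predictors: normalized height z, wind shear S, stability Ri
X = rng.uniform(0.0, 1.0, size=(2000, 3))
# Synthetic stand-ins for the two targets: momentum (kvm) and scalar (kvh)
# eddy diffusivities, built as smooth nonlinear functions of the predictors
kvm = np.sin(np.pi * X[:, 0]) * (1.0 + X[:, 1]) / (1.0 + X[:, 2])
kvh = 0.8 * kvm + 0.1 * X[:, 1]
Y = np.stack([kvm, kvh], axis=1)

# One-hidden-layer MLP with tanh activation, trained by full-batch
# gradient descent on a mean-squared-error loss
n_in, n_hid, n_out = 3, 32, 2
W1 = rng.normal(0.0, 0.5, (n_in, n_hid)); b1 = np.zeros(n_hid)
W2 = rng.normal(0.0, 0.5, (n_hid, n_out)); b2 = np.zeros(n_out)
lr = 0.05

for _ in range(3000):
    H = np.tanh(X @ W1 + b1)          # hidden-layer activations
    P = H @ W2 + b2                   # linear output layer: [kvm, kvh]
    err = P - Y                       # residual
    # Backpropagate the MSE gradient through both layers
    gW2 = H.T @ err / len(X); gb2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1.0 - H**2)  # tanh derivative
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Correlation between predictions and targets, the same metric the study
# reports (0.9996 for kvm, 0.9968 for kvh against the reference scheme)
pred = np.tanh(X @ W1 + b1) @ W2 + b2
r_kvm = np.corrcoef(pred[:, 0], Y[:, 0])[0, 1]
r_kvh = np.corrcoef(pred[:, 1], Y[:, 1])[0, 1]
print(f"r(kvm)={r_kvm:.4f}  r(kvh)={r_kvh:.4f}")
```

The per-call cost of such a network is a handful of small matrix products, which is why the abstract's reported overhead over the deterministic scheme stays in the few-percent range.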
