A Non-Intrusive Framework Using Acoustic Signals and Deep Learning for Boiling Diagnostics in Visual-Limited Environments
Abstract
Accurate monitoring of boiling heat transfer is critical for safeguarding high-power systems operating in environments where conventional optical diagnostics are hindered by radiation fields or restricted visual access. This study presents a non-intrusive framework that integrates hydroacoustic sensing with deep learning to infer near-wall boiling characteristics and enable predictive thermal assessment without visual access. In a prototypical subcooled flow-boiling facility representative of the Isotope Production Facility (IPF) at Los Alamos, hydrophones capture boiling-induced acoustic emissions that are transformed into background-removed Short-Time Fourier Transform (STFT) spectrograms. A convolutional neural network (CNN) then regresses heat flux, wall superheat, and key bubble parameters directly from these spectrograms. The CNN achieved high predictive accuracy across diverse operating conditions and remained robust under acoustic contamination for signal-to-noise ratios (SNRs) down to approximately 0 dB. When integrated into an ANSYS CFX wall-boiling model, the acoustically inferred parameters reproduced boiling-curve and critical heat flux (CHF) values consistent with image-based benchmarks. Furthermore, the model retained reliable performance under moderate variations in bulk temperature, flow rate, and hydrophone placement, confirming its generalizability across practical boundary conditions. These results establish hydroacoustic deep learning as a viable path toward real-time, radiation-tolerant boiling diagnostics and predictive thermal safety assessment in inaccessible systems such as the IPF.
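For illustration only, the minimal sketch below shows one plausible form of the pipeline the abstract describes: a hydrophone trace is converted to a background-removed STFT spectrogram and fed to a small CNN that regresses continuous boiling quantities. The sampling rate, STFT window, background-subtraction scheme, network architecture, and all names (background_removed_spectrogram, SpectrogramRegressor) are assumptions for this sketch, not the authors' implementation.

# Minimal sketch (not the paper's code): background-removed STFT spectrogram of a
# hydrophone trace, regressed to heat flux / wall superheat / a bubble parameter
# with a small CNN. Sampling rate, window length, and network depth are assumed.
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import stft

FS = 192_000          # assumed hydrophone sampling rate [Hz]
N_OUTPUTS = 3         # e.g., heat flux, wall superheat, bubble departure diameter

def background_removed_spectrogram(signal, background, fs=FS, nperseg=1024):
    """STFT magnitude of the boiling signal minus the time-averaged magnitude
    spectrum of a quiescent background recording (one plausible form of
    'background removal'); negative residuals are clipped to zero."""
    _, _, Z = stft(signal, fs=fs, nperseg=nperseg)
    _, _, Zb = stft(background, fs=fs, nperseg=nperseg)
    spec = np.abs(Z) - np.abs(Zb).mean(axis=1, keepdims=True)
    return np.clip(spec, 0.0, None).astype(np.float32)

class SpectrogramRegressor(nn.Module):
    """Small CNN mapping a single-channel spectrogram to continuous targets."""
    def __init__(self, n_outputs=N_OUTPUTS):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, n_outputs)

    def forward(self, x):               # x: (batch, 1, freq_bins, time_frames)
        return self.head(self.features(x).flatten(1))

if __name__ == "__main__":
    t = np.arange(FS) / FS                               # 1 s of synthetic data
    boiling = np.random.randn(FS) + np.sin(2 * np.pi * 5e3 * t)
    quiet = np.random.randn(FS)                          # stand-in background
    spec = background_removed_spectrogram(boiling, quiet)
    x = torch.from_numpy(spec)[None, None]               # add batch/channel dims
    y = SpectrogramRegressor()(x)
    print(y.shape)                                       # torch.Size([1, 3])

In an actual deployment the regressor would be trained against synchronized ground-truth measurements (e.g., image-based bubble statistics and thermocouple-derived wall superheat), and its outputs passed as closure parameters to the wall-boiling model, as the abstract describes for ANSYS CFX.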