Machine Learning-Based Tilt Correction for Millimeter-Wave Radar Measurements

Abstract

To address tilt-induced (inclination) error in radar water level measurement, this study evaluates the fitting accuracy of three regression algorithms (Random Forest, GBDT, and XGBoost) on measurements from an 80 GHz millimeter-wave radar. Building on machine learning nonlinear regression and error correction principles, the results show that machine learning markedly improves the stability and reliability of data acquisition in radar water level measurement. XGBoost performed best, achieving a relative improvement rate of 61.18% compared with the other two algorithms under identical conditions. These findings offer guidance for improving the reliability of radar water level measurements in complex field environments.
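
The abstract describes a regression-based correction in which tree-ensemble models are trained to map tilted radar readings back to the true water level. The sketch below is a minimal illustration of that three-algorithm comparison on synthetic data; the cosine tilt-error model, the feature set (measured range and tilt angle), and the hyperparameters are assumptions for illustration only and are not taken from the paper.

```python
# Hedged sketch: compare Random Forest, GBDT, and XGBoost regressors for
# correcting tilt-induced error in radar level readings. The synthetic data
# and error model below are illustrative assumptions, not the study's dataset.
import numpy as np
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split
from xgboost import XGBRegressor  # assumes the xgboost package is installed

rng = np.random.default_rng(0)
n = 2000

# Assumed inputs: true water level (m), antenna tilt angle (deg), and a radar
# reading lengthened by the slant path plus small measurement noise.
true_level = rng.uniform(1.0, 10.0, n)
tilt_deg = rng.uniform(0.0, 15.0, n)
measured = true_level / np.cos(np.deg2rad(tilt_deg)) + rng.normal(0.0, 0.01, n)

X = np.column_stack([measured, tilt_deg])
y = true_level
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "Random Forest": RandomForestRegressor(n_estimators=300, random_state=0),
    "GBDT": GradientBoostingRegressor(random_state=0),
    "XGBoost": XGBRegressor(n_estimators=300, max_depth=4,
                            learning_rate=0.1, random_state=0),
}

# Error of the uncorrected readings, used as the reference for improvement.
baseline = mean_absolute_error(y_te, X_te[:, 0])
print(f"Uncorrected MAE: {baseline:.4f} m")
for name, model in models.items():
    model.fit(X_tr, y_tr)
    mae = mean_absolute_error(y_te, model.predict(X_te))
    improvement = 100.0 * (baseline - mae) / baseline
    print(f"{name}: MAE {mae:.4f} m, improvement {improvement:.2f}%")
```

The improvement percentages printed by this toy setup depend entirely on the synthetic data and should not be compared with the 61.18% figure reported in the abstract.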