Beyond Logistic Regression: Calibration With Dropouts In Tiny Neural Networks

Abstract

Artificial Neural Networks (ANNs) often surpass simpler models in classification tasks but tend to exhibit overconfidence and poor calibration. This study examines the trade-offs between accuracy, calibration, and model complexity by comparing a compact dropout-regularized ANN with logistic regression on a real-world weather dataset of approximately 145,000 samples. The task is reformulated as three-class prediction of rainfall intensity, and both models are evaluated in two settings: the original feature space and a reduced feature space obtained via Linear Discriminant Analysis (LDA). Evaluation metrics include classification accuracy, Expected Calibration Error (ECE), and training time. The ANN with dropout achieves the highest accuracy (82.57%) and the best calibration (ECE = 0.0030), while logistic regression remains competitive despite its simplicity and smaller parameter footprint. LDA reduces the dimensionality from 16 features to 2 with minimal performance loss, enabling faster training. These results highlight the utility of dropout for improving uncertainty calibration and emphasize the practicality of simple models in constrained environments, with additional significance drawn from the use of real-world data.
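
To make the evaluation pipeline concrete, here is a minimal sketch of the Expected Calibration Error metric and of the LDA-plus-logistic-regression baseline described above. The bin count, the synthetic stand-in dataset, and all model hyperparameters are illustrative assumptions; the paper's actual weather data and settings are not reproduced here.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split


def expected_calibration_error(y_true, proba, n_bins=15):
    """Multiclass ECE: bin predictions by top-class confidence and average
    the absolute gap between confidence and accuracy, weighted by bin size."""
    confidences = proba.max(axis=1)            # top-class probability
    predictions = proba.argmax(axis=1)         # predicted class index
    accuracies = (predictions == y_true).astype(float)

    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            gap = abs(accuracies[mask].mean() - confidences[mask].mean())
            ece += mask.mean() * gap           # weight by fraction of samples
    return ece


# Synthetic stand-in for the weather data: 3 rainfall-intensity classes,
# 16 features reduced to 2 by LDA (at most n_classes - 1 components).
X, y = make_classification(n_samples=5000, n_features=16, n_informative=8,
                           n_classes=3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

lda = LinearDiscriminantAnalysis(n_components=2).fit(X_tr, y_tr)
clf = LogisticRegression(max_iter=1000).fit(lda.transform(X_tr), y_tr)
proba = clf.predict_proba(lda.transform(X_te))
print(f"ECE = {expected_calibration_error(y_te, proba):.4f}")
```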
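For comparison, a hypothetical sketch of a "tiny" dropout-regularized ANN of the kind the abstract describes, written in Keras. The layer widths and dropout rate are assumptions, since the exact architecture is not given here; its softmax outputs could be fed into `expected_calibration_error` above.

```python
import tensorflow as tf

# Assumed architecture: one hidden layer with dropout, matching the
# 16-feature input and 3-class rainfall-intensity output of the task.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(16,)),              # 16 weather features
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dropout(0.5),                    # dropout regularization
    tf.keras.layers.Dense(3, activation="softmax"),  # 3 rainfall classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(X_tr, y_tr, epochs=20, validation_split=0.1)
# proba = model.predict(X_te)   # then compute ECE as in the sketch above
```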
