Entropy Change in Sampling Predicts Downstream Performance in Neural Networks
Abstract
Data acquisition, the physical interface between the world and learning systems, fundamentally determines how much information is available before any model is trained. Existing methods for evaluating or optimizing acquisition typically rely on learned losses or domain-specific heuristics. Here, we introduce band-entropy change, a training-free scalar that quantifies how an acquisition process disturbs the spectral structure of the signal. We evaluate this quantity experimentally in three domains: vision (patch masking), wireless multiple-input multiple-output (MIMO) systems (pilot/antenna subsampling), and magnetic resonance imaging (MRI) ($k$-space undersampling). In all three, the magnitude of band-entropy change, computed directly from raw measurements, provides a useful, training-free indicator of downstream performance: smaller values are consistently associated with higher classification accuracy or reconstruction quality. Our results motivate entropy auditing, a task-agnostic, instrument-aware diagnostic that can evaluate or guide acquisition choices before training. This framework bridges physical measurement principles and machine learning, offering a physics-informed approach to optimizing sampling strategies and predicting downstream performance across domains.
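To make the quantity concrete, the sketch below shows one plausible way to compute a band-entropy change from raw measurements. The abstract does not specify the exact construction, so this is a minimal illustration under assumptions: "band entropy" is taken to be the Shannon entropy of the normalised spectral energy over contiguous frequency bands, and the function names (`band_entropy`, `band_entropy_change`) and the parameter `n_bands` are hypothetical, not the authors' implementation.

```python
import numpy as np

def band_entropy(x, n_bands=16):
    """Shannon entropy of the signal's energy distribution over frequency bands.

    Assumed definition: split the power spectrum into n_bands contiguous bands
    and treat the normalised per-band energies as a probability distribution.
    """
    spectrum = np.abs(np.fft.rfft(np.asarray(x, dtype=float).ravel())) ** 2
    bands = np.array_split(spectrum, n_bands)
    energy = np.array([b.sum() for b in bands])
    p = energy / energy.sum()
    p = p[p > 0]  # drop empty bands to avoid log(0)
    return -np.sum(p * np.log2(p))

def band_entropy_change(full_signal, sampled_signal, n_bands=16):
    """Magnitude of the entropy change induced by the acquisition step."""
    return abs(band_entropy(sampled_signal, n_bands) - band_entropy(full_signal, n_bands))

# Example: compare two acquisition schemes on the same raw signal, without any
# model training, and prefer the one with the smaller band-entropy change.
rng = np.random.default_rng(0)
signal = rng.standard_normal(1024)
mask_a = rng.random(1024) < 0.5        # random 50% subsampling
mask_b = np.arange(1024) % 2 == 0      # uniform 50% subsampling
delta_a = band_entropy_change(signal, signal * mask_a)
delta_b = band_entropy_change(signal, signal * mask_b)
print(f"scheme A: {delta_a:.3f}, scheme B: {delta_b:.3f}")
```

In this reading, entropy auditing amounts to computing such a scalar for each candidate acquisition scheme from raw measurements alone and ranking schemes by how little they perturb the spectral energy distribution.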