Dense Approximation of Learnable Problems with Streamable Problems
Abstract
We study the relationship between learnable and streamable optimization problems within the established three-tier hierarchy based on α-averaged operators. The central question is whether the computational restrictions of streamable optimization significantly limit the class of problems that can be solved. We prove that streamable problems are dense in learnable problems under a uniform residual metric: every learnable problem can be approximated arbitrarily closely by a streamable variant. The result is constructive: we give an explicit tensor-based algorithm that converts any learnable problem into a streamable approximation with controllable error. We demonstrate the theory through a complete analysis of ReLU network training and provide numerical validation on both synthetic data and MNIST classification, showing computational reductions of 2.5x to 25x with controllable accuracy loss.
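The abstract's hierarchy rests on α-averaged operators. As a concrete illustration (a hypothetical sketch, not the paper's tensor-based algorithm), the snippet below builds an α-averaged operator T = (1 − α)I + αS from a nonexpansive map S, here the gradient-step map of a least-squares objective with step size 1/L, and verifies that iterating T converges to the fixed point of S:

```python
import numpy as np

# Illustrative example only: an alpha-averaged operator
#   T = (1 - alpha) * I + alpha * S,
# where S is nonexpansive. We take S to be the gradient-step map of
# f(x) = 0.5 * ||A x - b||^2 with step size t = 1/L, where L is the
# largest eigenvalue of A^T A; this choice makes S nonexpansive.

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

L = np.linalg.eigvalsh(A.T @ A).max()  # Lipschitz constant of grad f
t = 1.0 / L                            # step size keeping S nonexpansive
alpha = 0.5                            # averaging parameter in (0, 1)

def S(x):
    # Gradient step for the least-squares objective.
    return x - t * (A.T @ (A @ x - b))

def T(x):
    # Alpha-averaged operator; its fixed points coincide with those of S.
    return (1 - alpha) * x + alpha * S(x)

x = np.zeros(5)
for _ in range(5000):
    x = T(x)

# Fixed points of S satisfy the normal equations A^T A x = A^T b,
# i.e. they are the least-squares minimizers.
x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x, x_star, atol=1e-6))
```

Averaging with α ∈ (0, 1) is what guarantees convergence of the fixed-point iteration for merely nonexpansive S (by the Krasnosel'skii–Mann theorem); the specific objective, step size, and problem dimensions above are arbitrary choices for demonstration.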