Bootstrap-Based Stabilization of Sparse Solutions in Tensor Models: Theory, Assessment, and Application
Abstract
Estimating sparse solutions in penalized tensor models poses a significant methodological challenge in high-dimensional settings. Although methods such as the Cenet-Tucker model incorporate Elastic Net penalties to achieve sparse and orthogonal representations, their stability under minimal perturbations remains a critical issue. This article proposes a bootstrap-based extension to stabilize the selected support, formulating a comprehensive protocol for resampling, model fitting, and evaluation that is applicable to penalized tensor models. Stability metrics are introduced, a rigorous simulation study is conducted, and the GSparseBoot package is under active development in R as a reproducible and modular computational tool. Although still in development, GSparseBoot already shows promising performance and reflects a strong commitment to an accessible, open-source implementation of advanced statistical methods. In simulations, the calibrated bootstrap increased the Stable Selection Index (SSI) by more than 25%, reduced the standard deviation of the support by 40%, and improved the average Jaccard Index across replicates, indicating a substantial gain in the structural consistency of the model. This article presents a well-founded methodological contribution, empirically validated and aligned with modern standards of reproducibility in data science.
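To make the stability quantities mentioned above concrete, the following R sketch shows one possible way to summarize the supports selected across bootstrap replicates via per-coefficient selection frequency and the average pairwise Jaccard Index. The helper functions (`jaccard`, `support_stability`) are illustrative assumptions and are not the GSparseBoot API; the exact definition of the SSI used in the article is not reproduced here.

```r
# Illustrative sketch only: hypothetical helpers, not the GSparseBoot API.
# Quantifies support stability across bootstrap replicates using
# per-coefficient selection frequency and the pairwise Jaccard index.

jaccard <- function(a, b) {
  # Jaccard similarity between two sets of selected coefficient indices
  u <- union(a, b)
  if (length(u) == 0) return(1)
  length(intersect(a, b)) / length(u)
}

support_stability <- function(supports, p) {
  # supports: list of integer vectors, one selected support per bootstrap fit
  # p: total number of candidate coefficients in the tensor model
  freq  <- tabulate(unlist(supports), nbins = p) / length(supports)
  pairs <- combn(length(supports), 2)
  jac   <- apply(pairs, 2, function(ix)
    jaccard(supports[[ix[1]]], supports[[ix[2]]]))
  list(selection_frequency = freq,   # per-coefficient inclusion rate
       mean_jaccard        = mean(jac))  # average pairwise support overlap
}

# Example: three bootstrap replicates over p = 10 candidate coefficients
boots <- list(c(1, 2, 5), c(1, 2, 6), c(1, 2, 5, 7))
support_stability(boots, p = 10)
```

Under this kind of summary, a higher mean Jaccard value and selection frequencies concentrated near 0 or 1 correspond to the more structurally consistent supports that the proposed bootstrap extension aims to produce.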