Bootstrap-Based Stabilization of Sparse Solutions in Tensor Models: Theory, Assessment, and Application
Abstract
This paper introduces BCenetTucker, a novel bootstrap-enhanced extension of the CenetTucker model designed to address the instability of sparse support recovery in high-dimensional tensor settings. By integrating mode-specific resampling directly into the penalized tensor decomposition process, BCenetTucker improves the reliability and reproducibility of latent structure estimation without compromising the model's interpretability. The proposed method is systematically benchmarked against classical CenetTucker, Stability Selection, and Bolasso, using real-world gene expression data from the GSE13159 leukemia dataset. Across multiple stability metrics, including support-size deviation, average Jaccard index, inclusion frequency, proportion of stable support, and the Stable Selection Index (SSI), BCenetTucker consistently demonstrates superior robustness and structural coherence relative to competing approaches. In the real data application, BCenetTucker preserved all essential signals originally identified by CenetTucker while uncovering additional marginal yet reproducible features. The method achieved high reproducibility (Jaccard index = 0.975; support-size deviation = 1.7 genes), confirming its sensitivity to weak but stable signals. The protocol is implemented in the GSparseBoot R library, enabling reproducibility, transparency, and applicability to diverse domains involving structured high-dimensional data. Altogether, these results establish BCenetTucker as a powerful and extensible framework for achieving stable sparse decompositions in modern tensor analytics.
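The abstract reports several support-stability summaries (average Jaccard index, support-size deviation, inclusion frequency, and proportion of stable support). The following is a minimal R sketch of how such summaries could be computed from a reference support and a set of bootstrap supports; the function name `stability_summary` and the 0.9 inclusion threshold are illustrative assumptions and are not part of the GSparseBoot API.

```r
# Illustrative sketch (not the GSparseBoot API): given a reference support and a
# list of bootstrap supports (each a vector of selected feature indices), compute
# the stability summaries named in the abstract.
stability_summary <- function(reference, boot_supports, threshold = 0.9) {
  # Jaccard similarity of each bootstrap support with the reference support
  jaccard <- sapply(boot_supports, function(s)
    length(intersect(reference, s)) / length(union(reference, s)))

  # Absolute deviation of each bootstrap support size from the reference size
  size_dev <- sapply(boot_supports, function(s)
    abs(length(s) - length(reference)))

  # Per-feature inclusion frequency across bootstrap replicates
  all_feats <- sort(unique(c(reference, unlist(boot_supports))))
  incl_freq <- sapply(all_feats, function(f)
    mean(sapply(boot_supports, function(s) f %in% s)))

  list(
    mean_jaccard        = mean(jaccard),                 # average Jaccard index
    support_size_dev    = mean(size_dev),                # mean support-size deviation
    inclusion_frequency = setNames(incl_freq, all_feats),
    prop_stable_support = mean(incl_freq >= threshold)   # proportion of stable support
  )
}

# Example (toy inputs):
# stability_summary(reference = c(1, 5, 9),
#                   boot_supports = list(c(1, 5), c(1, 5, 9, 12)))
```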