Kernel Ridge-Type Shrinkage Estimators in Partially Linear Regression Models with Correlated Errors
Abstract
This paper introduces ridge-type kernel smoothing estimators for partially linear time-series models that employ shrinkage estimation to handle autoregressive errors and severe multicollinearity in the parametric component. By combining a generalized ridge penalty with kernel smoothing, the proposed estimators mitigate the variance inflation caused by linear dependencies among predictors while also accounting for autocorrelation. Four well-known selection criteria are used to choose the bandwidth and shrinkage parameters optimally: Generalized Cross Validation (GCV), the Improved Akaike Information Criterion (AICc), the Bayesian Information Criterion (BIC), and Risk Estimation via Classical Pilots (RECP). We provide closed-form expressions for these estimators, establish their asymptotic properties, and present a risk-based analysis that highlights the benefits of ordinary and positive-part shrinkage extensions. Simulation studies confirm that the introduced shrinkage approaches outperform standard methods when predictors are strongly correlated, with this advantage growing as sample sizes increase. An application to airline delay time-series data further illustrates the efficacy and practical interpretability of the introduced methodology.
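To make the general idea concrete, the sketch below shows a Speckman-type ridge kernel fit for a partially linear model y_i = x_i'beta + f(t_i) + e_i, with the bandwidth h and ridge (shrinkage) parameter k chosen jointly by GCV. This is only a minimal illustration under simplifying assumptions (Gaussian Nadaraya-Watson smoother, simulated correlated predictors, AR(1) errors); it is not the authors' estimator, and names such as `ridge_pls_fit` and `gcv_score` are placeholders.

```python
import numpy as np

def nw_smoother(t, h):
    """Nadaraya-Watson smoother matrix S (Gaussian kernel, bandwidth h)."""
    d = (t[:, None] - t[None, :]) / h
    w = np.exp(-0.5 * d**2)
    return w / w.sum(axis=1, keepdims=True)

def ridge_pls_fit(y, X, t, h, k):
    """Partial-residual ridge fit: returns beta_hat, fitted f, and the hat matrix."""
    n = len(y)
    S = nw_smoother(t, h)
    Xt = (np.eye(n) - S) @ X          # partially residualized design
    yt = (np.eye(n) - S) @ y
    A = np.linalg.solve(Xt.T @ Xt + k * np.eye(X.shape[1]), Xt.T)
    beta = A @ yt                     # ridge-type estimate of the parametric part
    H = S + (np.eye(n) - S) @ X @ A @ (np.eye(n) - S)   # overall hat matrix
    f_hat = S @ (y - X @ beta)        # kernel estimate of the nonparametric part
    return beta, f_hat, H

def gcv_score(y, H):
    """Generalized cross-validation criterion for a linear smoother with hat matrix H."""
    n = len(y)
    resid = y - H @ y
    return n * (resid @ resid) / (n - np.trace(H))**2

# Illustrative use on simulated data with strongly correlated predictors and AR(1) errors.
rng = np.random.default_rng(0)
n, p, rho = 200, 4, 0.9
cov = rho * np.ones((p, p)) + (1 - rho) * np.eye(p)
X = rng.multivariate_normal(np.zeros(p), cov, n)
t = np.sort(rng.uniform(0, 1, n))
e = np.zeros(n)
for i in range(1, n):                 # AR(1) errors with phi = 0.5
    e[i] = 0.5 * e[i - 1] + rng.normal(scale=0.3)
y = X @ np.array([1.0, 2.0, -1.0, 0.5]) + np.sin(2 * np.pi * t) + e

# Joint grid search over (h, k) by GCV; the grids are arbitrary for illustration.
best = min(((h, k) for h in (0.05, 0.1, 0.2) for k in (0.0, 0.1, 1.0)),
           key=lambda hk: gcv_score(y, ridge_pls_fit(y, X, t, *hk)[2]))
print("GCV-selected (h, k):", best)
```

The same grid search could use AICc, BIC, or RECP in place of `gcv_score`; only the scoring function changes, since all four criteria operate on the residuals and the trace of the hat matrix of the fitted linear smoother.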