Using heteroskedasticity-consistent standard errors and the bootstrap for linear regression analysis in SPSS: A tutorial


Abstract

In the landscape of statistical software, ranging from customizable programming-language-based environments to point-and-click systems, SPSS remains a popular choice among researchers. In SPSS (version 29), analyses with conventional methods such as ordinary least squares regression can be performed easily. However, violating assumptions such as homoskedasticity or normality of the errors can distort type I error rates or reduce statistical power. SPSS provides a number of alternative inference methods for linear regression, but many of them are not easily located through the point-and-click menu. To facilitate data analysis when the assumptions of conventional inference methods are not met, this tutorial provides applied researchers with a step-by-step guide to performing linear regression with heteroskedasticity-consistent standard errors (HC3 and HC4) and two bootstrap resampling methods (the pairs bootstrap and the wild bootstrap). Each bootstrap method can further be combined with a bootstrap p-value, a percentile confidence interval, or a bias-corrected and accelerated confidence interval. The methods are then compared using a computer-generated data set. Although the main focus of this article is SPSS, a tutorial on how to carry out everything shown here in R (with custom functions) is included in the supplementary materials.
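For readers who want a quick impression of the R side, the sketch below illustrates the general idea using the widely available sandwich, lmtest, and boot packages rather than the article's custom supplementary functions. The simulated heteroskedastic data set, the object names, and the number of bootstrap replications are assumptions chosen for illustration; the wild bootstrap shown here is a simplified version with Rademacher weights and a plain percentile interval.

```r
# Minimal R sketch (not the article's supplementary functions):
# HC3/HC4 standard errors and two bootstrap schemes for a simple regression.
library(sandwich)  # heteroskedasticity-consistent covariance estimators
library(lmtest)    # coeftest() for tests with a supplied covariance matrix
library(boot)      # pairs bootstrap with percentile and BCa intervals

set.seed(1)
n   <- 200
x   <- rnorm(n)
y   <- 1 + 0.5 * x + rnorm(n, sd = 1 + abs(x))  # error variance depends on x
dat <- data.frame(x = x, y = y)

fit <- lm(y ~ x, data = dat)

# Inference with HC3 and HC4 standard errors
coeftest(fit, vcov = vcovHC(fit, type = "HC3"))
coeftest(fit, vcov = vcovHC(fit, type = "HC4"))

# Pairs bootstrap: resample (x, y) rows and refit the model
pairs_stat <- function(d, idx) coef(lm(y ~ x, data = d[idx, ]))
pairs_boot <- boot(dat, statistic = pairs_stat, R = 2000)
boot.ci(pairs_boot, type = c("perc", "bca"), index = 2)  # intervals for the slope

# Wild bootstrap (Rademacher weights): keep x fixed, perturb the residuals
res  <- residuals(fit)
yhat <- fitted(fit)
wild_slopes <- replicate(2000, {
  d_star <- data.frame(x = dat$x,
                       y = yhat + res * sample(c(-1, 1), n, replace = TRUE))
  coef(lm(y ~ x, data = d_star))["x"]
})
quantile(wild_slopes, c(0.025, 0.975))  # simple percentile interval for the slope
```

In this sketch the pairs bootstrap resamples whole observations, so the heteroskedastic structure of the data is carried into each resample, while the wild bootstrap keeps the design fixed and rescales the residuals with random signs; both ideas correspond to the resampling schemes named in the abstract, though the article's own implementation may differ in detail.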
