A Systematic Evaluation of Wording Effects Modeling Under the Exploratory Structural Equation Modeling Framework
Abstract
Wording effects, the systematic method variance arising from inconsistent responding to positively and negatively worded items measuring the same construct, are pervasive in the behavioral and health sciences. Although several factor modeling strategies have been proposed to mitigate their adverse effects, there is limited systematic research assessing their performance within the exploratory structural equation modeling (ESEM) framework. The present study evaluated the impact of different types of response bias related to wording effects (random and straight-line carelessness, acquiescence, item difficulty, and mixed) on ESEM solutions incorporating two popular method modeling strategies, the correlated traits-correlated methods minus one (CTC[M-1]) model and random intercept item factor analysis (RIIFA), as well as the “do nothing” approach. Five variables were manipulated using Monte Carlo methods: the type and magnitude of response bias, factor loadings, factor correlations, and sample size. Overall, the results showed that ignoring wording effects leads to poor model fit and serious distortions of the ESEM estimates. The RIIFA approach generally performed best at countering these adverse impacts and recovering unbiased factor structures, whereas the CTC(M-1) models struggled when biases affected both positively and negatively worded items. A straightforward guide is offered to applied researchers who wish to use ESEM with mixed-worded scales.
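For readers unfamiliar with the two method strategies, a minimal sketch of the RIIFA measurement equation may help; the notation below is illustrative and not taken from the article, assuming person i, item j, and substantive ESEM factors indexed by k:

x_{ij} = \nu_j + \sum_{k} \lambda_{jk}\,\eta_{ik} + r_i + \varepsilon_{ij}, \qquad \operatorname{Cov}(r_i, \eta_{ik}) = 0

Here r_i is a random intercept (method) factor whose loadings are fixed to 1 for every item, absorbing a person-level, acquiescence-like response style. By contrast, the CTC(M-1) strategy designates one wording (e.g., the positively worded items) as the reference and adds a method factor loaded only by the other set of items.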