Reporting Practices, Open Science Practices, and Trustworthiness of Simulation Studies in Psychology: A Questionnaire Study

Abstract

Simulation studies have received little attention in the replication crisis. In this exploratory study, we investigated reporting practices, open science practices, and the trustworthiness of simulation studies in psychology. A total of 221 researchers (response rate = 35.1%) completed a questionnaire about one simulation-based article they had authored in 2023 or 2024. Authors reported at least some selective reporting of conditions, methods, and performance measures in 50.2% of the articles. Reporting of nonconvergence and Monte Carlo Standard Errors was uncommon, likely due to limited awareness and weakly established norms. Only 19% of articles were neutral (i.e., the authors did not develop any of the evaluated methods), but selective reporting was not lower in these studies. Over 75% of authors reported sharing their complete code, but important measures for reproducibility, such as providing code instructions (39.6%) and independent code checking (20.3%), were uncommon. Guidelines were not used because researchers were unaware of them, perceived them as unnecessary, or considered them uncommon in their field. Authors estimated the probability that the conclusions of a typical simulation study are trustworthy at .74, exceeding their estimates for computational reproducibility (.64) and complete reporting (.55). Because our findings are based on self-report and are more favorable than prior literature reviews, our prevalence estimates may be optimistic. As thorough reporting and code sharing were not yet standard practice in simulation studies, we recommend that journals endorse reporting standards such as ADEMP (Morris et al., 2019), use standardized repositories and workflows for sharing materials, and check the availability of code.

Keywords: simulation studies, reproducibility, replicability, open science, QRPs, selective reporting, trustworthiness