Can Results-Blind Selection Improve Science Communication?

Abstract

Journalists are often maligned for covering sensational or desirable research results at the expense of studies with stronger methods. The present study tests whether such journalistic bias exists, and whether it is reduced when studies are selected based on their methods rather than their results (results-blind selection). Practicing journalists, editors, journalism faculty, and journalism graduate students (N = 413) read summaries of real social psychology studies and rated their interest in reporting on them. Participants were randomly assigned to read summaries that included (traditional) or excluded (results-blind) the results. Summaries varied on three within-subject dimensions: replication status, pre-registration status, and belief-consistency. Participants expressed more interest in replicable (vs. not replicable) and pre-registered (vs. non-pre-registered) studies, regardless of whether they learned the results, countering the notion that journalists favor research that rests on shaky methodological foundations. Meanwhile, results-blind selection was effective in reducing confirmation bias, suggesting it may be worth further exploration as a tool for improving science communication.