Fast-Track Your Abstract Screening: Mastering ASReview for Accelerating Abstract Screening and Evaluating Decisions From Automatic Screening Methods

Abstract

Research syntheses, such as systematic reviews and meta-analyses, are crucial for synthesizing research to support evidence-based decision-making. However, the abstract screening phase, in which researchers evaluate titles and abstracts for inclusion, is highly time-consuming and prone to cognitive bias and fatigue. To address these challenges, machine-learning-assisted tools, particularly those using active learning, have gained prominence. One such tool is ASReview, an open-source software package for semiautomating title and abstract screening in systematic reviews. ASReview incorporates user feedback to prioritize relevant studies, reducing screening time and improving efficiency. Despite its potential, many researchers remain uncertain about integrating ASReview into their workflows and making evidence-based decisions regarding the tool’s configuration, training, and stopping criteria. This tutorial provides a step-by-step guide to using ASReview, including practical examples from psychological research. We demonstrate the software’s application in two use cases: screening unlabeled abstracts using active learning and verifying results from automated screening methods. The tutorial also offers evidence-based recommendations for selecting stopping rules to balance sensitivity and efficiency. We further outline strategies for prescreening, data set preparation, model setup, and progress monitoring to ensure that researchers can maximize the tool’s benefits while maintaining scientific rigor. By offering evidence-based guidance at each stage of the process for practitioners without coding skills, this tutorial aims to help researchers harness AI-aided screening to enhance the quality and efficiency of research syntheses across disciplines.
