How to construct a robust search strategy for systematic reviews? A methodological proposal followed by a tutorial

Abstract

Background: Systematic reviews suffer from a methodological weakness: search queries are formulated without standardization and are difficult to develop, which compromises evidence quality by yielding suboptimal results.

Methods: To analyze search queries and propose a standardized model for their development, we evaluated 30 open-access systematic reviews on primary health care selected from PubMed, along with 30 studies on social science topics obtained from Campbell Collaboration publications. To analyze the main syntaxes of the databases and propose a standardized model, we examined the functionalities of PubMed, Embase, Web of Science, and Scopus and built a tutorial for creating queries that aim for an optimal result range. The model proposes essential descriptors derived from the research question, synonyms registered on the MeSH platform, and syntax adaptations for the cited databases. We applied the model to 19 search queries: one from a published systematic review, six from ongoing reviews, nine developed in the classroom, and three from reformulated reviews.

Results: Among the systematic reviews on primary health care, 10 of 30 reported the search query in the main text; the proportion was the same for reviews on social science topics. However, only three of the 30 search queries in primary health care reviews could be easily extracted and re-executed. Among the main functionalities of the analyzed databases, [MeSH] was identified as a key feature in PubMed and /exp in Embase, with comma usage being a major cause of command failures. The hypothetical range for optimal results suggests that, after max-normalizing the number of returned publications to values between 0 and 1, the ideal mean lies between 0.43 and 0.81. The tutorial includes examples and images to aid understanding.

Conclusions: The findings indicate a lack of reproducibility and standardization in search query formulation, highlighting a gap in the application of the scientific method. To address this issue, our standardized development model is accompanied by a didactic tutorial designed to ensure the methodological quality of a well-structured search query. Applying a reproducible method to systematic reviews safeguards methodological quality, is essential for evidence reliability, and prevents suboptimal outcomes in clinical decision-making.
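To make the syntax adaptation and the result-range check concrete, here is a minimal sketch (not from the paper) that assembles one set of descriptors into PubMed and Embase syntax and max-normalizes hypothetical result counts. The helper names, descriptor terms, and counts are illustrative assumptions; the field tags follow the abstract's notation ([MeSH] in PubMed, /exp in Embase).

```python
# Illustrative sketch only: helper names, descriptors, and counts are
# hypothetical, not taken from the paper's model or data.

descriptors = ["primary health care", "family practice"]

def pubmed_query(terms):
    # PubMed: tag each descriptor with [MeSH] and join with OR.
    return " OR ".join(f'"{t}"[MeSH]' for t in terms)

def embase_query(terms):
    # Embase: single-quote each term and append /exp to explode the heading.
    # Note: the abstract reports stray commas as a major cause of command
    # failures, so none are used inside the query strings here.
    return " OR ".join(f"'{t}'/exp" for t in terms)

def normalized_mean(counts):
    # Max normalization: divide each returned-publication count by the
    # largest count, scaling all values into [0, 1]; then take the mean.
    peak = max(counts)
    return sum(c / peak for c in counts) / len(counts)

print(pubmed_query(descriptors))
# "primary health care"[MeSH] OR "family practice"[MeSH]
print(embase_query(descriptors))
# 'primary health care'/exp OR 'family practice'/exp

counts = [120, 385, 510]  # hypothetical counts from three databases
print(f"normalized mean = {normalized_mean(counts):.2f}")
# 0.66 -- inside the 0.43-0.81 band the abstract suggests as ideal
```

Under these assumptions, a normalized mean below 0.43 or above 0.81 would signal a query that is too narrow or too broad relative to its best-performing database, prompting reformulation before screening begins.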