Qualitative tools to understand the capability, opportunity and motivation behind researcher behavior: a scoping review


Abstract

Background: Qualitative approaches offer tools to explore complex behaviors and perceptions, including those of researchers themselves. In recent years, the meta-research field has drawn on behavioral science to examine and improve research practices, employing frameworks such as COM-B (Capability, Opportunity, Motivation – Behavior), the Behavior Change Wheel (BCW), and the Theoretical Domains Framework (TDF). While these frameworks have been applied in studies of researcher behavior, it remains unclear how they have been used to inform the development and validation of qualitative data collection instruments designed to assess researchers’ capability, opportunity, and motivation. Aim: To map and present the available information on tools developed to assess capability, opportunity, and motivation in qualitative studies concerning researcher behavior. Methods: A search strategy was developed following PRESS (Peer Review of Electronic Search Strategies) principles and applied to the following databases: PubMed, Embase, Scopus, Web of Science, PsycINFO, CINAHL and ProQuest Theses and Dissertations, following the JBI methodology for scoping reviews. Results: The applied search strategy resulted in 10,627 publications. After excluding duplicates and records not meeting the inclusion criteria, 27 publications remained. 81.5% of the studies were published open access, and we gained access to 89% of the data collection instruments. Most studies (93%) developed their own data collection instrument, but only 35% mentioned any strategy for validation or pilot testing. All 27 studies used behavioral models in the development of data collection tools (guide/questionnaire), 85% used models to present results, 78% used them in the discussion, and 59% presented the models in their theoretical framework.
Discussion: We identified a significant movement toward the public sharing of qualitative behavioral instruments, but one still marked by significant limitations in transparency and theoretical application. These findings highlight the need for clearer guidelines on the design, reporting, and sharing of qualitative tools for behavioral research, as well as a stronger integration of behavioral theory and qualitative methodology. Conclusion: The sharing of qualitative behavioral instruments is growing, but important limitations remain in transparency, widespread dissemination, and the integration of behavioral models. Presenting greater detail at all stages of the research could improve the assessment and re-use of existing tools. Registration: The study protocol was registered prior to data extraction and is available at https://doi.org/10.17605/OSF.IO/4KWTD.

Article activity feed

  1. This Zenodo record is a permanently preserved version of a PREreview. You can view the complete PREreview at https://prereview.org/reviews/17781579.

    This review is the result of a virtual, collaborative Live Review discussion held on Tuesday, November 11, 2025, and Friday, November 14, 2025, organized and hosted by the PREreview Club of Future of Research Communication and e-Scholarship (FORCE11). The discussion was joined by 9 people: 3 facilitators, 4 review authors and 2 discussion participants (Ava Chan and Mercury Shitindo). We thank all participants whose thoughtful and generous contributions made this review possible. The review authors dedicated additional asynchronous time over the course of two weeks to synthesize the notes and prepare this final report. A final synthesis was prepared by Rosario Rogel-Salazar and reviewed by all review authors.

    Summary

    This scoping review investigates how qualitative tools—interview guides, focus-group guides, open-ended questionnaires—have been developed, applied, and validated to study researcher behavior using behavioral frameworks such as COM-B, TDF, BCW, and A-COM-B. Using a preregistered protocol and PRISMA-ScR guidelines, the authors identified 27 studies published through September 2024 and extracted detailed information about methodological design, theoretical integration, openness, and instrument availability.

    The review makes a valuable contribution by mapping an emerging methodological landscape, highlighting widespread open-access practices and the proliferation of behavioral-model–informed qualitative instruments. However, transparency around tool development, validation procedures, and theoretical integration remains inconsistent. Strengthening sections on methodological clarity, conceptual framing, and reporting of study selection would significantly enhance the overall utility and rigor of the manuscript.

    Major concerns and feedback

    1. Insufficient transparency in study selection and PRISMA reporting. Reviewers struggled to understand how the initial ~5,000 records were narrowed to 27 included studies. Numerical inconsistencies in the PRISMA flowchart (e.g., identical counts for removed duplicates and ineligible records) further complicate interpretation. Recommendation: Correct PRISMA counts; provide clearer explanations of exclusion criteria; optionally list excluded studies or categorize reasons for exclusion to improve reproducibility.

    2. Incomplete conceptual framing of theoretical integration. The conclusions strongly emphasize underuse or superficial use of behavioral frameworks, yet this normative argument is not sufficiently introduced or justified early in the manuscript. Recommendation: Strengthen the Background by explaining why deep theoretical integration matters, drawing on implementation science and qualitative research design sources cited later in the paper.

    3. Limited search of grey literature and implications for bias. The review did not search key repositories where qualitative tools are often shared (OSF, Zenodo, Figshare, protocols.io). This omission likely reinforces the dominance of Global North, health-science publications observed in the sample. Recommendation: Explicitly acknowledge this limitation and, if feasible, conduct a supplementary targeted grey literature search.

    4. Ambiguous definition of "open access" and "open tools". The term "open" is used broadly but without clear definitions. It is unclear whether "open" includes supplemental PDFs, licensed repositories, materials available upon request, or fully reusable resources with open licenses. Recommendation: Add operational definitions and coding criteria for openness and, if needed, revise results to reflect consistent categorization.

    5. Ambiguity in inclusion logic for "qualitative tools". About 41% of included studies used questionnaires, which can be qualitative, quantitative, or mixed. Inclusion criteria do not specify how qualitative components were identified. Recommendation: Clarify how the authors determined whether a questionnaire qualified as a qualitative instrument.

    6. Issues of internal consistency. Reviewers noted mismatches in the reported number of included studies (27 vs. 28), discrepancies between the preregistered plan and manuscript (e.g., number of piloted studies), and conflicting statements across sections. Recommendation: Perform a careful consistency check across the manuscript.

    Minor concerns and feedback

    1. Issues in tables and figures

      1. PRISMA numerical errors require correction.

      2. Some author names are inconsistently formatted (e.g., "Hughes, Williamson & Young" should be "Hughes et al.").

      3. Table 3 could be made more useful by merging "publication status" and "tool availability", or by graphically indicating depth of theoretical integration.

    2. Accessibility and formatting issues

      1. Avoid full-justified text for accessibility.

      2. In the data extraction section, rephrase to avoid unnecessary nested parentheses and clarify ambiguous phrasing (e.g., "where the sample is inserted").

      3. Provide short definitions or consistent use of acronyms (COM-B, BCW, TDF, etc.).

    3. Citation and reference inconsistencies

      1. Add citations for the PCC framework, the theoretical-integration literature, and the seven previously known articles used to develop the search strategy.

    Concluding remarks

    This scoping review provides an important contribution to qualitative meta-research by mapping how behavioral frameworks are used to study researcher behavior. Its strengths include a transparent pre-registered protocol, clear reporting of data extraction, and extensive descriptive analysis of the selected studies. Addressing the major methodological and reporting concerns identified above—particularly those related to transparency, conceptual framing, and definitions of openness—will significantly increase the clarity and practical value of the work.

    The manuscript will be valuable not only to researchers designing qualitative behavioral studies, but also to methodologists, open science practitioners, funders, journal editors, and graduate advisors who support rigorous and theoretically grounded research practices.

    Competing interests

    The review authors declare no competing interests related to the authors or content of this preprint.

    This review represents the opinions of the authors and does not represent the position of FORCE11 as an organization.

    Use of Artificial Intelligence (AI)

    The authors declare that they did not use generative AI to come up with new ideas for their review.