Gaps between Open Science activities and actual recognition systems: Insights from an international survey


Abstract

There are global movements aiming to promote reform of traditional research evaluation and reward systems. However, a comprehensive picture of existing best practices and of institutional efforts to integrate Open Science into these frameworks remains underdeveloped. The aim of this study was to identify the perceptions and expectations of research communities worldwide regarding how Open Science activities are (or should be) formally recognised and rewarded. To achieve this, a global survey was conducted in the framework of the Research Data Alliance, recruiting participants from five continents and 37 countries. Although most participants reported that their organisation had some form of formal Open Science policy, the majority indicated that their organisation lacked any initiative or tool providing specific credit or rewards for Open Science activities. Exceptions were reported by researchers from France, the United States, the Netherlands and Finland, who affirmed having such mechanisms in place. The study found that, among the various Open Science activities, Open or FAIR data management and sharing stood out as especially deserving of explicit recognition and credit. Open Science indicators in research evaluation and/or career progression processes emerged as the most preferred type of reward.

Article activity feed

  1. This Zenodo record is a permanently preserved version of a Structured PREreview. You can view the complete PREreview at https://prereview.org/reviews/11234562.

    Does the introduction explain the objective of the research presented in the preprint? Yes. The abstract and introduction were written clearly and concisely. In particular, the abstract provided a plain-language summary of the main findings from the survey.
    Are the methods well-suited for this research? Somewhat appropriate. While the authors report that there were 230 responses to the survey, it is unclear what the underlying population was, making it difficult to assess a response rate. This is a known challenge of snowball sampling (trading a greater number of responses for an uncertain baseline). The authors should also disclose survey completion rates - it is likely that, with only 19 questions, drop-out rates were very low and completion was high.
    Are the conclusions supported by the data? Highly supported. As the results rely solely on descriptive analysis and the authors clearly describe the limitations of the sampling frame for generalizability, these results make complete sense even to a reader with little familiarity with the topic.
    Are the data presentations, including visualizations, well-suited to represent the data? Somewhat appropriate and clear. While I am a fan of pie charts for simple descriptions, they make it very difficult for human readers to draw comparative contrasts by interocular percussion (i.e., by looking at them). Figure 1 adds a further complication as a helioplot/radiograph in which not all information is properly labeled. An additional presentation of this information in a table would be useful.
    How clearly do the authors discuss, explain, and interpret their findings and potential next steps for the research? Somewhat clearly. The summary and interpretation of the results are spot on given the data presented. However, the authors could do more to explain both the "so what" of the results and the "what's next" based on their findings and any natural outgrowths of them. From a research perspective, what gaps remain to be filled? Should there be a more comprehensive study with a more generalizable sample? From a policy perspective, how should open science advocates and policymakers proceed with translating these findings into practice or policy, or otherwise shape their perspective? From a researcher perspective, how should researchers seek credit for practicing open and FAIR data management?
    Is the preprint likely to advance academic knowledge? Somewhat likely
    Would it benefit from language editing? Yes. There are citation formatting issues - many of the in-line citations link to Zotero rather than to the in-document list of references, making look-up cumbersome for the reader. Rather than linking to Zotero, which readers may not have access to, the citations should link either to the DOIs of the underlying references or to the bibliography in the paper. Other in-line citations that are referenced more than once lack accessible hyperlinks altogether (e.g., citation 11, Hahn et al., on page 13 goes nowhere). This is an accessibility issue that should be addressed in the next revision.
    Would you recommend this preprint to others? Yes, it's of high quality
    Is it ready for attention from an editor, publisher or broader audience? Yes, after minor changes

    Competing interests

    The author declares that they have no competing interests.