
  1. Evolution of Peer Review in Scientific Communication

    This article has 1 author:
    1. Dmitry Kochetkov
    This article has been curated by 1 group:
    • Curated by MetaROR

      This article provides a brief history and review of peer review. It evaluates peer review models against the goals of scientific communication, expressing a preference for publish, review, curate (PRC) models. The review and history are useful. However, the article's progression and arguments, along with what it seeks to contribute to the literature, need refinement and clarification. The argument for PRC is under-developed due to a lack of clarity about what the article means by scientific communication. Clarity here might make the endorsement of PRC seem like less of a foregone conclusion.

      As an important related point, and in the interest of transparency, I declare that I am a founding managing editor of MetaROR, which is a PRC platform. It may be advisable for the author to make a similar declaration, because I understand that they are affiliated with one of the universities involved in the founding of MetaROR.

      Recommendations from the editor

      I strongly endorse the main theme of most of the reviews, which is that the progression and underlying justifications for this article's arguments need a great deal of work. In my view, this article's main contribution seems to be the evaluation of the three peer review models against the functions of scientific communication. I say 'seems to be' because the article is not very clear on that, and I hope you will consider clarifying what your manuscript seeks to add to the existing work in this field.

      In any case, if that assessment of the three models is your main contribution, that part is somewhat underdeveloped. Moreover, I never got the sense that there is clear agreement in the literature about what the tenets of scientific communication are. Note that scientific communication is a field in its own right.

      I also agree that the paper is too strongly worded at times, with limitations and assumptions in the analysis minimised or not stated. For example, all of the typologies and categories drawn could easily be reorganised, and there is a high degree of subjectivity in this entire exercise. Subjective choices should be highlighted and made salient for the reader.

      Note that greater clarity, rigour, and humility may also help with any alleged or actual bias.

      Some more minor points are:

      1. I agree with Reviewer 3 that the 'we' perspective is distracting.

      2. The paragraph starting with 'Nevertheless' on page 2 is very long.

      3. There are many points where language could be shortened for readability, for example:

        • Page 3: 'decision on publication' could be 'publication decision'.
        • Page 5: 'efficiency of its utilization' could be 'its efficiency'.
        • Page 7: 'It should be noted…' could be 'Note that…'.

      4. Page 7: 'It should be noted that…' – this needs a reference.

      5. I'm not sure that registered reports reflect a hypothetico-deductive approach (page 6). For instance, systematic reviews (even non-quantitative ones) are often published as registered reports, and Cochrane required this even before the move towards registered reports in quantitative psychology.

      6. I agree that modular publishing sits uneasily as its own chapter.

      7. Page 14: 'The "Publish-Review-Curate" model is universal that we expect to be the future of scientific publishing. The transition will not happen today or tomorrow, but in the next 5-10 years, the number of projects such as eLife, F1000Research, Peer Community in, or MetaROR will rapidly increase'. This seems overly strong (an example of my larger critique and that of the reviewers).

    Reviewed by MetaROR

    This article has 5 evaluations.
  2. Preprint review services: Disrupting the scholarly communication landscape?

    This article has 4 authors:
    1. Susana Oliveira Henriques
    2. Narmin Rzayeva
    3. Stephen Pinfield
    4. Ludo Waltman
    This article has been curated by 1 group:
    • Curated by MetaROR

      Editorial Assessment

      The authors present a descriptive analysis of preprint review services. The analysis focuses on the services' relative characteristics and differences in preprint review management. The authors conclude that such services have the potential to improve the traditional peer review process.

      Two metaresearchers reviewed the article. They note that the background section and literature review are current and appropriate, the methods used to search for preprint review services are generally sound and sufficiently detailed to allow for reproduction, and the discussion related to anonymizing articles and reviews during the review process is useful.

      The reviewers also offered suggestions for improvement. They point to terminology that could be clarified. They suggest adding URLs for each of the 23 services included in the study. Other suggestions include explaining why overlay journals were excluded, clarifying the limitation related to including only English-language platforms, archiving rawer input data to improve reproducibility, adding details related to the qualitative text analysis, discussing any existing empirical evidence about misconduct as it relates to different models of peer review, and improving field inclusiveness by avoiding conflation of "research" and "scientific research."

      The reviewers and I agree that the article is a valuable contribution to the metaresearch literature related to peer review processes.

      Handling Editor: Kathryn Zeiler

      Competing interest: I am co-Editor-in-Chief of MetaROR working with Ludo Waltman, a co-author of the article and co-Editor-in-Chief of MetaROR.

    Reviewed by MetaROR, PREreview

    This article has 4 evaluations.
  3. Health and medical researchers are willing to trade their results for journal prestige: results from a discrete choice experiment

    This article has 6 authors:
    1. Natalia Gonzalez Bohorquez
    2. Sucharitha Weerasuriya
    3. David Brain
    4. Sameera Senanayake
    5. Sanjeewa Kularatna
    6. Adrian Barnett
    This article has been curated by 1 group:
    • Curated by MetaROR

      In this article the authors use a discrete choice experiment to study how health and medical researchers decide where to publish their research, showing the importance of impact factors in these decisions. The article has been reviewed by two reviewers. The reviewers consider the work to be robust, interesting, and clearly written. The reviewers have some suggestions for improvement. One suggestion is to emphasize more strongly that the study focuses on the health and medical sciences and to reflect on the extent to which the results may generalize to other fields. Another suggestion is to strengthen the embedding of the article in the literature. Reviewer 2 also suggests extending the discussion of the sample selection and addressing in more detail the question of why impact factors still persist.

      Competing interest: Ludo Waltman is Editor-in-Chief of MetaROR working with Adrian Barnett, a co-author of the article and a member of the editorial team of MetaROR.

    Reviewed by MetaROR

    This article has 3 evaluations.