Open Access
Open Peer Review


Optimal strategies to consider when peer reviewing a systematic review and meta-analysis

BMC Medicine 2015, 13:274

https://doi.org/10.1186/s12916-015-0509-y

Received: 10 July 2015

Accepted: 23 September 2015

Published: 2 November 2015

Abstract

Systematic reviews are popular. A recent estimate indicates that 11 new systematic reviews are published daily. Nevertheless, evidence indicates that the quality of reporting of systematic reviews is not optimal. One likely reason is that the authors’ reports have received inadequate peer review. There are now many different types of systematic reviews, and peer reviewing them can be enhanced by using a reporting guideline to supplement whatever template the journal editors have asked you, as a peer reviewer, to use. Additionally, keeping up with the current literature, whether as a content expert or by staying aware of advances in systematic review methods, is likely to make for a more comprehensive and effective peer review. Providing a brief summary of what the systematic review has reported is an important first step in the peer review process (and one not performed frequently enough). At its core, it provides the authors with some sense of what the peer reviewer believes was performed (Methods) and found (Results). Importantly, it also provides clarity regarding any potential problems in the methods, including statistical approaches for meta-analysis, results, and interpretation of the systematic review, for which the peer reviewer can seek explanations from the authors; these clarifications are best presented as questions to the authors.

Keywords

Meta-analysis; Peer review; PRISMA; Reporting guidelines; Systematic review; The EQUATOR Network

Introduction

Two days ago you received an electronic correspondence from an editor of a journal inviting you to peer review a systematic review and meta-analysis (henceforth called systematic review). The editor asked you to let the journal know within a couple of days whether you will accept or decline the invitation. The request for a quick response is typically an effort by the journal to keep the peer review process as short as possible. Having reviewed the abstract and author list to ensure no obvious conflict(s) of interest, you have agreed to the peer review request.

If you are an experienced peer reviewer you will be familiar with the process required to successfully complete your review. However, if you have limited experience peer reviewing, there are some excellent resources, such as those discussed in Patel’s recent blog series on peer review [1–3]. Additionally, reviewers may wish to consider discussing the process with a more senior colleague and/or mentor; however, if help is received to complete the review, it is important that editors are informed about these details.

What to consider when invited to peer review a systematic review

Peer reviewers should initially assess whether the journal has approached them for an opinion as a content expert, methodologist, or both, or perhaps this was not indicated in the editor’s request. If the reviewer’s area of expertise is methodology, it is probably best to minimize the time spent commenting on the content area of the systematic review. Content experts are often formally trained in methodology through, for example, a Master’s degree in clinical epidemiology, and can provide excellent content and methodological feedback.

Reports of systematic reviews are typically longer and include more tables, figures, and appendices than other types of research articles. A general plea to authors submitting systematic reviews for consideration for publication is to use continuous line numbering in the manuscript to facilitate easier peer reviewing; it allows the reviewer to specify the exact line(s) in the manuscript to which comments refer and saves the reviewer time counting lines in the manuscript.

A further immediate point to assess is whether the journal has provided a template to complete the peer review report. Even if this is the case, supplementing it with a reporting guideline checklist (see below for information about the EQUATOR Network) is highly recommended. Examining the checklist items provides a memory jog for the reviewer to assess the degree to which the authors have reported on each of these and to seek clarification where there is ambiguity, incomplete reporting, or other preclinical or clinical content and/or methodological concerns. Many journals, including BMC Medicine, recommend using reporting guidelines as part of the peer review process. There is emerging evidence to support this guidance. Cobo et al. [4] have shown that the use of a reporting guideline during the peer review process improves the quality of the final publication. A comprehensive list of reporting guidelines can be found in the EQUATOR Network’s library of reporting guidelines [5]. Patel also discusses EQUATOR in her second blog on peer review [2]. For systematic reviews, several reporting guideline options are available (Table 1).
Table 1

Examples of reporting guidelines available for peer reviewers assessing systematic reviews (type of systematic review: helpful reporting guideline for peer reviewing)

Systematic reviews of randomized controlled trials evaluating healthcare interventions: The Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) [22]; hundreds of biomedical journals endorse PRISMA

Systematic reviews of observational studies: Meta-analysis Of Observational Studies in Epidemiology (MOOSE) [23]

Systematic reviews involving psychological interventions: The Meta-Analysis Reporting Standards (MARS) guidance [24]; the American Psychological Association endorses MARS

Synthesis of qualitative studies: The ENhancing Transparency in REporting the synthesis of Qualitative research (ENTREQ) statement [25]

Realist reviews: Realist And Meta-narrative Evidence Synthesis – Evolving Standards (RAMESES) for reviewing meta-narrative reviews and realist reviews [26, 27]

A tip for peer reviewers

The checklist items covered by PRISMA are relevant to any systematic review, not just those summarizing the benefits and harms of a healthcare intervention, although some adjustments should be considered. For example, assessing the risk of bias is a key concept, but the items used to assess this in a diagnostic review are likely to focus on issues such as the spectrum of patients and the verification of disease status, which differ from reviews of interventions.

The PRISMA Statement includes a lengthy explanation and elaboration document explaining to authors why specific information should be included in the report of a systematic review [6]. This paper can also be helpful for peer reviewers. However, the PRISMA Statement has not been updated since its publication in 2009, and peer reviewers should be aware of the new and emerging literature regarding methods and specific content, for both clinical and pre-clinical research. Such knowledge adds to the breadth and depth of a peer review. For example, in 2009, Sampson et al. [7] developed the Peer Review of Electronic Search Strategies (PRESS) guideline to help reduce errors found in published search strategies, a fundamental step in the systematic review process. Another tip for peer reviewers: it is important to assess whether the authors have indicated if their search strategy underwent any type of peer review, and to seek clarification if this information cannot be found in the manuscript.

A general suggestion is to keep the peer review report to one single-spaced, double-sided page, which equates to approximately 1000 words, sufficient to provide a thoughtful, meaningful, and concise review. There is little evidence as to whether lengthy peer review reports are more useful than shorter, more concise ones. Another strong suggestion is to keep the peer review as evidence-based as possible [8]; as a peer reviewer it is important to be as complete and transparent as possible in the feedback to the authors, ideally supporting comments with the best available evidence whenever possible. For example, if the authors of a systematic review do not appear to have assessed the risk of bias in the included studies, this is problematic, as there is strong evidence that inadequate reporting of randomization can exaggerate estimates of treatment effect [9, 10]. As a peer reviewer, it is appropriate to ask the authors for clarification as to why they did not assess risk of bias, providing them with an explanation, and references, as to why this is important in the conduct of a systematic review. A final general recommendation is to keep personal opinion about the systematic review to a minimum, and never to be rude in any comments to the authors.

Completing the peer review

A peer review has good face validity when it starts with a summary of the systematic review [2]. This can be achieved in a short paragraph summarizing what the authors performed (Methods) and found (Results). The précis gives the authors (and editor) a sense that the reviewer has correctly summarized the most salient aspects of the systematic review. Following this, it is often helpful to continue the peer review report by mentioning positive aspects of the manuscript. For example, perhaps the authors have addressed a very topical issue, such as the health effects of sugar-sweetened beverages, or the eligibility criteria for studies considered for the review are well thought out and documented. There is little benefit in being negative about everything in the manuscript. In general, authors make a substantial effort to report their review and may have spent considerable time on their manuscript preparation.

The next part of the peer review report should focus on any major problem(s) detected, including fatal flaw(s). For example, the authors may not have reported on the completion of an electronic search for potentially eligible articles and instead relied on studies contained in their personal files, or they may not have reported on the performance of any risk of bias assessments of the included studies. It is often useful to pose these problems as questions for clarification to the authors, since the authors perhaps simply forgot to report them. Alternatively, they may not have considered the preclinical or clinical issues or methods enquired about by the reviewer. There is considerable evidence that the completeness of reporting of most published research, including systematic reviews, is not optimal [11]. Effective peer reviewers need to be as familiar as possible with recent advances in the literature related to their area of expertise. For example, perhaps the study examines the effectiveness of yoga versus pharmacotherapy for the management of mild to moderate depression in teenagers. As a content expert in depression, the reviewer may be aware of two other recently published reviews addressing the same question, yet the authors have not cited these studies or provided a rationale as to why their study was undertaken; an effective peer reviewer should seek clarification from the authors on these points.

Selective reporting of outcomes is a serious and prevalent problem in clinical and pre-clinical research, including systematic reviews [12–14]. The optimal method for peer reviewers to examine for outcome reporting bias is to compare the outcomes reported in the completed review against those documented in the protocol. Peer reviewers can do this in different ways. If the report included a PROSPERO [15] registration number, it is possible to examine the registration entry against the completed report. Alternatively, an increasing number of systematic review protocols are being published in journals, such as Systematic Reviews and BMJ Open, and referenced in the completed review. Examining the published protocol against the completed review provides another opportunity to assess for potential reporting biases.
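At its core, this comparison is a set difference between the outcomes declared in the registration and those reported in the manuscript. The sketch below illustrates the idea; the outcome names are entirely hypothetical and are not drawn from any real review.

```python
# Illustrative sketch (hypothetical outcome names): comparing outcomes
# declared in a review's PROSPERO registration against those reported
# in the completed manuscript, as a quick check for selective reporting.
registered = {"all-cause mortality", "hospital admission",
              "quality of life", "serious adverse events"}
reported = {"all-cause mortality", "hospital admission",
            "responder rate"}  # "responder rate" was never registered

missing = registered - reported        # registered but not reported
unregistered = reported - registered   # reported but never registered

print("Registered outcomes not reported:", sorted(missing))
print("Reported outcomes not registered:", sorted(unregistered))
```

Either discrepancy warrants a clarification question to the authors: dropped outcomes may signal selective reporting, while newly appearing outcomes may be post hoc additions that should be flagged as such.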

Approximately half of systematic reviews include at least one meta-analysis. Some of these analyses have statistical issues that could likely have been detected during peer review. These systematic reviews require special attention and should have a separate section in the peer review report. For example, have the authors indicated the statistical approach used? Do they provide confidence intervals along with any reported point estimates? Have they assessed for the presence of publication bias and, if so, did they rely on a graphical method alone (e.g. a funnel plot) or also apply a statistical test (e.g. Egger’s test)?
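For reviewers less familiar with these quantities, the sketch below shows what they are. It is illustrative only: the effect estimates are invented, and the article does not prescribe any particular software. It computes a DerSimonian-Laird random-effects pooled estimate with its 95% confidence interval, and Egger’s regression test for small-study effects, using numpy and scipy.

```python
# Illustrative sketch only: the data are made up and the method choices
# (DerSimonian-Laird, Egger's test) are common defaults, not the
# article's prescription.
import numpy as np
from scipy import stats

# Hypothetical study-level effects (log odds ratios) and standard errors
yi = np.array([0.30, 0.10, 0.45, 0.20, 0.05])
sei = np.array([0.15, 0.20, 0.25, 0.10, 0.30])

# Fixed-effect (inverse-variance) pooled estimate
wi = 1.0 / sei**2
theta_fe = np.sum(wi * yi) / np.sum(wi)

# DerSimonian-Laird estimate of between-study variance (tau^2)
Q = np.sum(wi * (yi - theta_fe) ** 2)
C = np.sum(wi) - np.sum(wi**2) / np.sum(wi)
tau2 = max(0.0, (Q - (len(yi) - 1)) / C)

# Random-effects pooled estimate and 95% confidence interval
wi_re = 1.0 / (sei**2 + tau2)
theta_re = np.sum(wi_re * yi) / np.sum(wi_re)
se_re = np.sqrt(1.0 / np.sum(wi_re))
ci = (theta_re - 1.96 * se_re, theta_re + 1.96 * se_re)

# Egger's test: regress the standardized effect on precision; an
# intercept far from zero suggests small-study effects (e.g. publication
# bias). Note it is the intercept, not the slope, that is tested.
res = stats.linregress(1.0 / sei, yi / sei)
t_int = res.intercept / res.intercept_stderr
p_egger = 2 * stats.t.sf(abs(t_int), df=len(yi) - 2)

print(f"Random-effects estimate {theta_re:.3f} "
      f"(95% CI {ci[0]:.3f} to {ci[1]:.3f}); tau^2 = {tau2:.4f}")
print(f"Egger intercept p = {p_egger:.3f}")
```

A peer reviewer would not normally rerun the authors’ analysis, but knowing what these quantities represent makes it easier to judge whether a manuscript reports them completely. Note also that Egger’s test has low power when few studies are pooled, so a non-significant result is not reassuring on its own.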

Reports of systematic reviews can present with several inherent issues. Often, peer reviewers may request numerous substantial modifications to the systematic review, such as modification of the eligibility criteria to include studies with a different dose of pharmacotherapy or an additional study design. It is not always useful to ask authors to do what amounts to conducting a new systematic review. Peer reviewers should remember that they have been asked to peer review a completed systematic review. Indeed, such issues might be a marker for recommending rejection (confidentially to the editor).

Once comments regarding any major concerns with the systematic review have been presented, other relevant but less serious issues should be indicated. For example, the review authors may have indicated the completion of a meta-analysis without any indication as to whether they used a fixed-effect, random-effects, or Bayesian approach. Seeking clarification regarding this point would help interested readers replicate the analytical methods used by the authors.

Sometimes the discussion section of the systematic review goes beyond the results or the results are interpreted too optimistically; this is termed spin and has been noted in systematic reviews [16]. An effective peer reviewer will assess the discussion section of the review with particular reference to any possibility of spin, seeking clarification from the authors whenever it is suspected. Peer reviewers should also assess the discussion section to see whether the authors have commented on limitations of the review itself, including those of the included studies.

Apart from traditional systematic reviews there are now many other types of reviews, such as rapid reviews, scoping reviews, individual participant data meta-analyses, and network meta-analyses [17]. Some of these reviews require a deeper knowledge and understanding of statistics. Similarly, an increasing number of systematic reviews use mixed methods approaches. Peer reviewers should not feel obliged to take on the task of peer reviewing these reports unless they are familiar with the methods. The PRISMA Statement has been extended to cover some of these designs [18, 19] and includes relevant checklists that can facilitate the peer review process.

Finally, when invited to peer review a protocol of a systematic review, reviewers might find the PRISMA-P checklist helpful [20, 21]. Information and access to all of these checklists can be found at the EQUATOR Network’s comprehensive library of reporting guidelines [5].

Conclusions

Systematic reviews come in all shapes and sizes, and there are several reporting guidelines that can facilitate the peer review process. Peer reviewers should ensure that they provide the authors with a brief summary of their report, followed by a review of any fatal/major flaws detected or any other concerns; posing these concerns as questions and requests for clarification to the authors is far better than being accusatory or rude. Peer reviewers play an important role in helping ensure that published systematic reviews are of the highest possible quality. In practical terms, this means that the systematic review should be completely and transparently reported and that the methods can be reproduced by interested readers.

Declarations

Acknowledgements

Thanks to Drs. Alam, Dickson, and Singh for their thoughtful comments on an earlier version of this paper.

Open AccessThis article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

Authors’ Affiliations

(1)
Clinical Epidemiology Program, Ottawa Hospital Research Institute
(2)
School of Epidemiology, Public Health and Preventive Medicine, University of Ottawa
(3)
Canadian EQUATOR Centre

References

  1. Patel J. A beginner’s guide to peer review: Part One. BioMed Central Blog. 2015. http://blogs.biomedcentral.com/bmcblog/2015/05/13/beginners-guide-peer-review-part-one/. Accessed 22 September 2015.
  2. Patel J. A beginner’s guide to peer review: Part Two. BioMed Central Blog. 2015. http://blogs.biomedcentral.com/bmcblog/2015/06/08/beginners-guide-peer-review-part-two/. Accessed 22 September 2015.
  3. Patel J. A beginner’s guide to peer review: Part Three. BioMed Central Blog. 2015. http://blogs.biomedcentral.com/bmcblog/2015/07/09/beginners-guide-peer-review-part-three/. Accessed 22 September 2015.
  4. Cobo E, Cortes J, Ribera JM, Cardellach F, Selva-O'Callaghan A, Kostov B, et al. Effect of using reporting guidelines during peer review on quality of final manuscripts submitted to a biomedical journal: masked randomised trial. BMJ. 2011;343:d6783.
  5. EQUATOR Network. Enhancing the QUAlity and Transparency Of health Research. Library. http://www.equator-network.org/library/. Accessed 22 September 2015.
  6. Liberati A, Altman DG, Tetzlaff J, Mulrow C, Gotzsche PC, Ioannidis JP, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. Ann Intern Med. 2009;151:W65–94.
  7. Sampson M, McGowan J, Cogo E, Grimshaw J, Moher D, Lefebvre C. An evidence-based practice guideline for the peer review of electronic search strategies. J Clin Epidemiol. 2009;62:944–52.
  8. Moher D, Jadad AR. How to peer review a manuscript. In: Jefferson T, Godlee F, editors. Peer Review in Health Sciences. 2nd ed. London: BMJ Books; 2003. p. 183.
  9. Schulz KF, Altman DG, Moher D. Allocation concealment in clinical trials. JAMA. 2002;288:2406–7; author reply 2408–9.
  10. Savovic J, Jones HE, Altman DG, Harris RJ, Juni P, Pildal J, et al. Influence of reported study design characteristics on intervention effect estimates from randomized, controlled trials. Ann Intern Med. 2012;157:429–38.
  11. Glasziou P, Altman DG, Bossuyt P, Boutron I, Clarke M, Julious S, et al. Reducing waste from incomplete or unusable reports of biomedical research. Lancet. 2014;383:267–76.
  12. Silagy CA, Middleton P, Hopewell S. Publishing protocols of systematic reviews: comparing what was done to what was planned. JAMA. 2002;287:2831–4.
  13. Kirkham JJ, Dwan KM, Altman DG, Gamble C, Dodd S, Smyth R, et al. The impact of outcome reporting bias in randomised controlled trials on a cohort of systematic reviews. BMJ. 2010;340:c365.
  14. Moher D, Avey M, Antes G, Altman DG. The National Institutes of Health and guidance for reporting preclinical research. BMC Med. 2015;13:34.
  15. Booth A, Clarke M, Dooley G, Ghersi D, Moher D, Petticrew M, et al. The nuts and bolts of PROSPERO: an international prospective register of systematic reviews. Syst Rev. 2012;1:2.
  16. Dunn AG, Arachi D, Hudgins J, Tsafnat G, Coiera E, Bourgeois FT. Financial conflicts of interest and conclusions about neuraminidase inhibitors for influenza: an analysis of systematic reviews. Ann Intern Med. 2014;161:513–8.
  17. Tricco AC, Tetzlaff J, Moher D. The art and science of knowledge synthesis. J Clin Epidemiol. 2011;64:11–20.
  18. Stewart LA, Clarke M, Rovers M, Riley RD, Simmonds M, Stewart G, et al. Preferred Reporting Items for Systematic Review and Meta-Analyses of individual participant data: the PRISMA-IPD Statement. JAMA. 2015;313:1657–65.
  19. Cornell JE. The PRISMA extension for network meta-analysis: bringing clarity and guidance to the reporting of systematic reviews incorporating network meta-analyses. Ann Intern Med. 2015;162:797–8.
  20. Moher D, Shamseer L, Clarke M, Ghersi D, Liberati A, Petticrew M, et al. Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015 statement. Syst Rev. 2015;4:1.
  21. Shamseer L, Moher D, Clarke M, Ghersi D, Liberati A, Petticrew M, et al. Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015: elaboration and explanation. BMJ. 2015;349:g7647.
  22. Moher D, Liberati A, Tetzlaff J, Altman DG, PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Ann Intern Med. 2009;151:264–9, W64.
  23. Stroup DF, Berlin JA, Morton SC, Olkin I, Williamson GD, Rennie D, et al. Meta-analysis of observational studies in epidemiology: a proposal for reporting. Meta-analysis Of Observational Studies in Epidemiology (MOOSE) group. JAMA. 2000;283:2008–12.
  24. APA Publications and Communications Board Working Group on Journal Article Reporting Standards. Reporting standards for research in psychology: why do we need them? What might they be? Am Psychol. 2008;63:839–51.
  25. Tong A, Flemming K, McInnes E, Oliver S, Craig J. Enhancing transparency in reporting the synthesis of qualitative research: ENTREQ. BMC Med Res Methodol. 2012;12:181.
  26. Wong G, Greenhalgh T, Westhorp G, Buckingham J, Pawson R. RAMESES publication standards: realist syntheses. BMC Med. 2013;11:21.
  27. The RAMESES Projects. http://www.ramesesproject.org/index.php?pr=Home_Page. Accessed 22 September 2015.

Copyright

© Moher. 2015