Open Access

Reporting guidelines for realist evaluations seek to improve clarity and transparency

BMC Medicine 2016, 14:109

https://doi.org/10.1186/s12916-016-0658-7

Received: 8 July 2016

Accepted: 16 July 2016

Published: 21 July 2016

Abstract

An increasing number of realist evaluations are being conducted from a wide range of disciplinary perspectives and with diverse, fit-for-purpose methods. This commentary discusses the recent BMC Medicine publication of RAMESES II reporting guidelines for realist evaluations. Knowledge users such as program implementers and decision-makers will benefit from the increased transparency of reporting and interpretation in light of the totality of evidence encouraged by this guidance. It is hoped that these reporting guidelines will eventually lead to improved knowledge synthesis and contribute to the cumulative science regarding realist evaluation.

Please see related article: https://bmcmedicine.biomedcentral.com/articles/10.1186/s12916-016-0643-1

Keywords

Realist evaluation; Reporting guidelines; Knowledge synthesis

Background

Realist evaluations are theory-driven evaluations that seek to understand how complex interventions work, for whom they work, and how programs and their effects are influenced by context and setting; the approach is rooted in the philosophy of scientific realism [1, 2]. Realist evaluations are carried out from different disciplinary perspectives and use a plurality of methods that are fit for purpose. They can be useful in understanding how a program or policy works, in which settings, and for whom. Realist evaluations can be used to hypothesize whether a program works in different settings and for different participants, including “program designers, implementers, and recipients” [3]. These hypotheses are tested and refined during the evaluation of the program, and the results can be understood using context-mechanism-outcome (CMO) configurations. For this reason, realist evaluations are appreciated by implementers and decision-makers who, when designing or funding programs, seek to understand how a program or policy works and in which circumstances.

In a recent article in BMC Medicine, Wong and colleagues seek to improve the transparency of reporting of realist evaluations by developing consensus- and evidence-based reporting guidelines [3]. They used transparent and accepted methods endorsed by the EQUATOR Network [4] and outlined by Moher and colleagues [5] to develop this guidance. Their protocol was published in BMJ Open [6], and 35 experts with diverse disciplinary backgrounds and experience, drawn from six countries, participated in three rounds of a Delphi survey. A high response rate was achieved across all rounds of the Delphi (range 76–90 %).

The tool consists of 20 items, organized into six sections: Title, Summary or Abstract, Introduction, Methods, Results, and Discussion. Each item is accompanied by a detailed rationale and exemplars of good practice. The guidance also builds in flexibility in the order of reporting, and the authors strongly encourage evaluators to document a justification for any variance from the reporting items, including omissions. The first item calls for identifying the study as a realist evaluation in the title, which will aid in identifying these types of studies in the future. Items 3 (rationale for evaluation), 4 (program theory), 7 (rationale for realist evaluation), 8 (environment and context), and 10 (evaluation design) are particularly important because, based on evaluations of the reporting of realist reviews [7], these items are more likely to be poorly reported. Items 11 through 13 relate to data collection, the recruitment process, and data analysis. Item 16, on the summary of findings, encourages authors to rate the strength of the evidence from the evaluation, which is extremely important for stakeholders who seek to use this information. Item 20 relates to the source of funding and the declaration of any potential conflicts of interest.

One major advance of this reporting guideline is the encouragement to situate the realist evaluation in the totality of the evidence (item 18), which will help program implementers to interpret the findings in light of other relevant evidence while considering the contribution of differences in settings and populations. This is in keeping with other global initiatives to consider the entirety of the evidence when reporting results of primary studies, such as The Lancet guidelines [8] and the CONSORT Statement [9]. We agree with the authors that situating findings in the light of relevant evidence will contribute to the cumulative evidence base and science regarding other similar programs and policies.

The authors are already promoting the uptake of these reporting standards by program evaluators through the RAMESES listserv and through training workshops and materials. They also plan to evaluate the usefulness and impact of the guidelines in the future. An additional activity that could enhance their impact is registration with the EQUATOR Network [4], which also provides tools and resources to help journal editors facilitate the use of reporting guidelines by authors.

Conclusions

The RAMESES II reporting guidelines for realist evaluations are an important initiative to ensure these evaluations are reported in sufficient detail, in the context of existing evidence, and with a rating of the strength of evidence for the main findings, which will greatly assist users of the evaluations. Because reviews are only as good as their included studies, in our opinion this could also eventually improve realist syntheses that include realist evaluations [10]. We look forward to the upcoming development of methodological quality standards for realist evaluations, which will further advance the science of this type of study and likely improve realist synthesis as well.

Declarations

Acknowledgement

We would like to thank Manosila Yoganathan for her help formatting this commentary.

Funding

Not applicable.

Authors’ contributions

VW and ACT drafted the initial manuscript. Both authors critically reviewed and approved the final manuscript.

Competing interests

VW and ACT declare that they have no competing interests.

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

Authors’ Affiliations

(1)
Bruyère Research Institute
(2)
School of Epidemiology, Public Health and Preventive Medicine, University of Ottawa
(3)
Li Ka Shing Knowledge Institute, St. Michael’s Hospital
(4)
Epidemiology Division, Dalla Lana School of Public Health, University of Toronto

References

  1. Pawson R, Tilley N. Realistic evaluation. London: Sage; 1997.
  2. Pawson R. The science of evaluation: a realist manifesto. London: Sage; 2013.
  3. Wong G, Westhorp G, Manzano A, Greenhalgh J, Jagosh J, Greenhalgh T. RAMESES II reporting standards for realist evaluations. BMC Med. 2016;14(1):96.
  4. EQUATOR Network. http://www.equator-network.org/. Accessed 18 Jul 2016.
  5. Moher D, Schulz KF, Simera I, Altman DG. Guidance for developers of health research reporting guidelines. PLoS Med. 2010;7(2):e1000217.
  6. Greenhalgh T, Wong G, Jagosh J, Greenhalgh J, Manzano A, Westhorp G, Pawson R. Protocol - the RAMESES II study: developing guidance and reporting standards for realist evaluation. BMJ Open. 2015;5(8):e008567.
  7. Tricco AC, Soobiah C, Antony J, Cogo E, MacDonald H, Lillie E, Tran J, D’Souza J, Hui W, Perrier L. A scoping review identifies multiple emerging knowledge synthesis methods, but few studies operationalize the method. J Clin Epidemiol. 2016;73:19–28.
  8. Clark S, Horton R. Putting research into context - revisited. Lancet. 2010;376(9734):10–1.
  9. Schulz KF, Altman DG, Moher D. CONSORT 2010 Statement: updated guidelines for reporting parallel group randomised trials. J Clin Epidemiol. 2010;63(8):834–40.
  10. Wong G, Greenhalgh T, Westhorp G, Buckingham J, Pawson R. RAMESES publication standards: realist syntheses. BMC Med. 2013;11:21.

Copyright

© The Author(s). 2016
