- Open Access
Communicating health risks in science publications: time for everyone to take responsibility
© The Author(s). 2018
- Received: 12 October 2018
- Accepted: 12 October 2018
- Published: 13 November 2018
Research that is poorly communicated or presented is as potentially damaging as research that is poorly conducted or fraudulent. Recent examples illustrate how the problem often lies with researchers, not press officers or journalists. The quest for publication and ‘impact’ must not outweigh the importance of accurate representation of science; herein, we suggest steps that researchers, journalists and press officers can take to help ensure this.
- Absolute risk
- Relative risk
- Press release
- Practical significance
- Risk communication
Medical researchers are increasingly in search of a newspaper headline; this, coupled with the plethora of traditional and social media outlets hungry for the latest research on a topic ‘relevant’ to their audiences, is proving a match made in heaven. Readership, sales and ‘impact’ all seem to benefit, but do any of us end up at all wiser?
The Global Burden of Disease alcohol study, published in The Lancet, provided only relative risks for serious harm, including a 0.5% increased risk at one drink per day, and gave no estimates of the absolute risks from light drinking. However, the press officers at The Lancet recognised that such reporting was not in line with the journal’s own guidelines and asked the authors for the absolute figures, which resulted in the following quote: “914 in 100,000 15–95 year olds would develop a condition in one year if they did not drink, but 918 people in 100,000 who drank one alcoholic drink a day would develop an alcohol-related health problem in a year”.
Putting this into perspective, four extra cases in 100,000 means that, for every 25,000 people having one drink per day, only one more person would experience a (serious) alcohol-related condition each year. Since one standard drink containing 10 g of alcohol per day adds up to 3.65 kg of alcohol a year, equivalent to around 16 bottles (70 cl) of 40% ABV gin, this corresponds to 400,000 bottles of gin shared between 25,000 people to give rise to one case of serious harm. If stood in a line, these bottles would stretch for approximately 40 km, about the length of a marathon.
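The arithmetic behind these figures can be checked directly. The sketch below uses the absolute risks quoted above; the ethanol density (~0.789 g/mL) is a standard reference figure and not taken from the paper itself:

```python
# Sketch: reproduce the absolute-risk arithmetic for the alcohol study.

# 918 vs 914 cases per 100,000 per year -> 4 extra cases per 100,000
extra_per_100k = 918 - 914
people_per_extra_case = 100_000 / extra_per_100k   # 25,000 drinkers per extra case

# One 10 g drink per day for a year
grams_alcohol_per_year = 10 * 365                  # 3,650 g = 3.65 kg

# Ethanol in a 70 cl bottle of 40% ABV gin (density ~0.789 g/mL)
grams_per_bottle = 700 * 0.40 * 0.789              # ~221 g of ethanol

bottles_per_person = grams_alcohol_per_year / grams_per_bottle   # ~16.5 bottles a year
total_bottles = bottles_per_person * people_per_extra_case       # ~400,000 bottles

print(f"{people_per_extra_case:.0f} drinkers per extra case")
print(f"{bottles_per_person:.1f} bottles of gin per person per year")
print(f"{total_bottles:,.0f} bottles per extra case of serious harm")
```

The point of laying the calculation out this way is that the headline-grabbing relative risk dissolves into a very small absolute one once the denominators are made explicit.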
Viewed this way, the authors’ claim that their results should lead public health bodies “to consider recommendations for abstention” looks weakly supported at best. Thus, a risk that was neither statistically nor practically significant became a major headline story – this hardly seems like trustworthy science communication.
In both of the examples provided above, the messages to the public (namely, anyone who did not read the actual papers in minute detail), as well as the calls to policy-makers, are only weakly related to the actual research results – how did this occur, and how can we prevent it from happening again (and again)?
The path from research findings to media headlines is often a tortuous one, fraught with various hazards; nevertheless, in the two cases presented above, it is possible to backtrack along the decision-making pathway. Journalists were initially alerted to these two stories by press releases. The alcohol risk study press release included the sub-headline “The authors suggest there is no safe level of alcohol”, on which the press chose to focus. The adrenaline study press release stated that “Using adrenaline in cardiac arrests results in less than 1% more people leaving hospital alive – but nearly doubles the survivors’ risk of severe brain damage”, with journalists choosing to reproduce the press release almost verbatim. Therefore, should the press officers be held accountable? Did they misinterpret the numbers to ‘spin’ the story? No – the press releases actually quoted the researchers verbatim, with the authors’ own interpretations of the numbers being reported.
These two examples illustrate a seemingly continuing pattern, wherein journalists’ reports are fairly accurately reproduced from the press releases they are given and press officers work hard to clearly and accurately represent their authors’ views. Therefore, much of the responsibility lies with the researchers themselves, perhaps feeling under pressure to maximise the ‘publishability’ of studies.
As the UK Government’s Universal Ethical Code for Scientists states, responsible communication is one of the three key responsibilities in scientific research, for good reason. In 1995, research showing that the third-generation contraceptive pill doubled the risk of venous thrombosis generated dramatic headlines in the UK and is thought to have resulted in 10,000 abortions (plus 10,000 births) attributed to women stopping the pill. Research on the pill’s thromboembolic effects did indeed show a potential two-fold increase in the risk, yet the absolute risk had increased from only 1 in 7000 to 2 in 7000 – if these numbers had been provided, it is unlikely that such a ‘scare’ would have ensued. Whilst this example may be the most famous, it is not the only case where poor communication of research has resulted in serious real-life consequences.
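The contrast between the two framings of the pill-scare finding can be made explicit with a minimal sketch, using only the figures quoted above:

```python
# Sketch: the 1995 'pill scare' finding framed two ways.
baseline_risk = 1 / 7000    # absolute risk of venous thrombosis, older pill
new_risk = 2 / 7000         # absolute risk on the third-generation pill

relative_risk = new_risk / baseline_risk      # 2.0 -> "doubles the risk"
absolute_increase = new_risk - baseline_risk  # one extra case per 7,000 women

print(f"relative risk: {relative_risk:.1f}x")
print(f"absolute increase: about 1 extra case per {1 / absolute_increase:,.0f} women")
```

Both statements describe exactly the same data; only the relative framing produces a frightening headline.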
Media professionals spend every day assessing the consequences of their words, with most health journalists taking that responsibility very seriously. Conversely, researchers working on long and complex projects may not be accustomed to having to consider such potential outcomes. Nevertheless, journals, press offices, the Science Media Centre, the Academy of Medical Sciences and many responsible researchers have produced guidelines to help ensure the accurate communication of risk. Ultimately, it is the responsibility of every researcher and journal editor to consider the wider effects of what they publish, and to publish data true to the results that they have found.
All those involved in the pipeline of research, publication and publicity have a role in ensuring that risks are clearly presented, with their magnitude put into perspective, their importance not exaggerated, and their uncertainty communicated. We therefore recommend that (a) authors should be able to justify the claims made in their papers and should work closely with press offices to ensure accurate press releases; (b) journals and peer reviewers should enforce guidelines and damp down – rather than encourage – exaggerated claims by authors; (c) press officers should ensure that absolute risks are included in press releases and that the conclusions cannot easily be misinterpreted; and (d) journalists should demand that researchers put their research claims into perspective.
AF drafted the initial document and DS edited the final text. Both authors read and approved the final manuscript.
The authors declare that they have no competing interests.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.
- Gabbatiss J. No safe level of alcohol consumption, major study concludes. The Independent. 2018. Available from: https://www.independent.co.uk/news/health/alcohol-drinking-no-safe-level-health-heart-disease-cancer-study-a8505181.html. [cited 2018 Oct 9]
- Paddock C. The “safest level of drinking is none,” says alcohol study. Medical News Today. 2018. Available from: https://www.medicalnewstoday.com/articles/322874.php. [cited 2018 Oct 9]
- GBD 2016 Alcohol Collaborators. Alcohol use and burden for 195 countries and territories, 1990–2016: a systematic analysis for the Global Burden of Disease Study 2016. Lancet. 2018;6736(18):1–21.
- The Lancet. Articles, systematic reviews and meta-analyses in the Lancet: formatting guidelines. Lancet. 2009:1–4. http://www.download.thelancet.com/pb/assets/raw/Lancet/authors/metaguidelines.pdf.
- The Lancet. The Lancet: Alcohol is associated with 2.8 million deaths each year worldwide. Press release. Available from: https://www.eurekalert.org/pub_releases/2018-08/tl-tla082218.php. [cited 2018 Oct 9]
- Bodkin H. Cardiac arrest resuscitation drug has needlessly brain-damaged thousands. The Telegraph. 2018. Available from: https://www.telegraph.co.uk/science/2018/07/18/cardiac-arrest-resuscitation-drug-has-needlessly-brain-damaged/. [cited 2018 Oct 9]
- Lay K. Adrenaline “doubles risk of brain damage”. The Times. 2018. Available from: https://www.thetimes.co.uk/article/adrenaline-doubles-risk-of-brain-damage-sn3xr9tkw. [cited 2018 Oct 9]
- Perkins GD, Ji C, Deakin CD, Quinn T, Nolan JP, Scomparin C, et al. A randomized trial of epinephrine in out-of-hospital cardiac arrest. N Engl J Med. 2018;379:711–21.
- Spiegelhalter D. Trust in numbers. J R Stat Soc Ser A Stat Soc. 2017;180(4):948–65.
- University of Warwick. Using adrenaline in cardiac arrests results in less than one percent more people leaving hospital alive but nearly doubles the survivors’ risk of severe brain damage. Press release. Available from: https://www.eurekalert.org/pub_releases/2018-07/uow-uai071818.php. [cited 2018 Oct 9]
- Government Office for Science. Rigour, respect and responsibility: a universal ethical code for scientists. 2007.
- Furedi A. The public health implications of the 1995 “pill scare”. Hum Reprod Update. 1999;5:621–6.
- Gigerenzer G, Gaissmaier W, Kurz-Milcke E, Schwartz LM, Woloshin S. Helping doctors and patients make sense of health statistics: toward an evidence-based society. Psychol Sci Public Interest. 2007;8(2):53–96.