Global collaborative networks on meta-analyses of randomized trials published in high impact factor medical journals: a social network analysis

Background: Research collaboration contributes to the advancement of knowledge by exploiting the results of scientific efforts more efficiently, but the global patterns of collaboration on meta-analysis are unknown. The purpose of this research was to describe and characterize the global collaborative patterns in meta-analyses of randomized trials published in high impact factor medical journals over the past three decades.

Methods: This was a cross-sectional, social network analysis. We searched PubMed for relevant meta-analyses of randomized trials published up to December 2012. We selected meta-analyses (with randomized trials as the primary evidence source) published in the top seven high impact factor general medical journals (according to Journal Citation Reports 2011): The New England Journal of Medicine, The Lancet, the BMJ, JAMA, Annals of Internal Medicine, Archives of Internal Medicine (now renamed JAMA Internal Medicine), and PLoS Medicine. Opinion articles, conceptual papers, narrative reviews, reviews without meta-analysis, reviews of reviews, and other study designs were excluded.

Results: Overall, we included 736 meta-analyses, involving 3,178 authors, 891 institutions, and 51 countries. The BMJ published the greatest number of articles (39%), followed by The Lancet (18%), JAMA (15%) and the Archives of Internal Medicine (15%). The USA, the UK, and Canada headed the absolute global productivity ranking by number of papers. We identified the 64 authors and the 39 institutions with the highest publication rates. We also found 82 clusters of authors (one group with 55 members and one group with 54 members) and 19 clusters of institutions (one major group with 76 members). The most prolific authors were mainly affiliated with the University of Oxford (UK), McMaster University (Canada), and the University of Bern (Switzerland).
Conclusions: Our analysis identified networks of authors, institutions and countries publishing meta-analyses of randomized trials in high impact medical journals. This information may be used to strengthen scientific capacity for collaboration and to help promote a global agenda for future research of excellence.


Background
The past decades have seen the establishment of evidence synthesis, particularly systematic reviews and meta-analyses, as a key component of evidence-based medicine (EBM) [1,2]. Meta-analyses of randomized trials have become more widely accepted by clinicians, researchers and policy makers as a useful tool to critically assess the totality of evidence on a research question. When performed well and reported completely, incorporating explicit and detailed methods and results, such studies produce information that can have major and immediate effects on medical practice, research agendas and the establishment of healthcare policies.
Important milestones that may have encouraged research in this field, from the point of view of scientific publications and institutional development of EBM [1-4], include the creation of international research groups, centers, and consortia (such as the Centre for Evidence Based Medicine and The Cochrane Collaboration in the 1990s), in addition to groups developing reporting guidelines to ensure articles contain all essential information, such as QUOROM (Quality of Reporting of Meta-analyses) [5] and, more recently, PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) [6].
Global health challenges require research collaboration and multi-lateral programs on a global scale, owing to the nature and magnitude of the public health problems. Noteworthy examples include major environmental, political, and social determinants of health, as well as complex and changing clinical issues related to the conditions and risk factors that cause the highest burden of disease around the world [7-9]. Despite continuous efforts of individual scientists and institutions to remedy deficiencies in healthcare effectiveness and safety, multiple gaps and disparities remain. Research collaboration contributes to the advancement of knowledge by exploiting the results of scientific efforts more efficiently, but the global patterns of collaboration on meta-analysis are unknown. Given that meta-analyses can provide high-quality clinical evidence regarding the robustness of the effects of healthcare interventions to inform medical practice, there is an urgent need to evaluate and promote scientific activity and growth in the field of EBM [4,10-12].
Social network analysis [13], the study of structure derived from the regularities in the patterning of relationships between social entities (which might be people or organizations), is grounded in the assessment of empirical data, and can provide an appropriate approach to identify top scientists and researchers, groups of excellence, and leading institutions. It also offers information to assess the citation patterns among papers within a specialty [14], to identify gaps in the evidence from scientific research [15], and to understand the structure and nature of relationships and interactions within a scientific community that collaborates to better achieve common or compatible goals [13,16].
This study aimed to describe and characterize global collaborative patterns with regard to the conduct of meta-analyses of randomized trials published over the past three decades in high impact factor medical journals, by applying techniques from social network analysis.

Methods

Design and sample
In December 2012, we searched PubMed for reports of meta-analyses of randomized trials published in one of the top seven high impact general medical journals, as identified in 2011 based on an impact factor of at least 10. We also performed complementary hand-searches and reviewed the references of identified eligible reports to identify additional meta-analyses.
We included two types of articles from the eligible journals: original research reports and reviews (both incorporating meta-analyses of randomized trials). Editorials, commentaries, and other opinion articles were excluded. We also excluded conceptual papers, literature (narrative) reviews, reviews of reviews, meta-analyses of observational studies that did not consider randomized trials, single randomized trials, and other study designs (such as cost-effectiveness analyses and epidemiological studies).
For the purposes of this study, we selected all articles published in English and indexed in PubMed between January 1985 and December 2012. One researcher with expertise in evidence synthesis (FC-L) screened the titles and abstracts, and identified all potentially eligible articles. The same researcher excluded the articles not meeting the pre-specified criteria.

Data extraction
For each included paper, we extracted information on the year of publication, the journal title, and the authors' names, institutional affiliation(s), and country of origin. This information was downloaded through the Science Citation Index-Expanded (SCI-E) Web of Knowledge platform, version 5, in April 2013. The Web of Knowledge platform is a database that contains all the above information, including the full addresses of all authors of every paper. We also used the SCI-E to determine the extent to which each study had been cited in the peer-reviewed literature, using the 'times cited' number (that is, the number of times a publication has been cited by other publications). A standardization process was conducted to bring together the different names of a particular author or institution. Specifically, one researcher (AA-A) checked names where an individual author appeared in two or more different forms (for example, 'Gordon Guyatt' or 'Gordon H Guyatt'), using coincidence in that author's place(s) of work as the basic criterion for normalization (for example, McMaster University, Canada).
In the case of institutions, we unified the different variants to match the name recorded in public directories of institutions. Similarly, given that the institutional names in many records comprised two or more institutions (for example, university hospitals, research centers and academic institutions), we distinguished between these names by recording every variant of an individual macroinstitution that could be identified for each bibliographic record (for example, for the institutional address 'Reproductive Medicine Unit, Department of Obstetrics & Gynaecology, University of Adelaide, Queen Elizabeth Hospital, Australia', the standardization approach was to record 'University of Adelaide, Australia' separately from 'Queen Elizabeth Hospital, Australia'). With all this information, we constructed a Microsoft Access database.
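As an illustration of this normalization step, a variant-to-canonical lookup keyed on the author's recorded workplace can be sketched as below. The mapping table here is hypothetical (in the study this reconciliation was performed manually):

```python
# Illustrative sketch of the author-name standardization step.
# The variant table below is hypothetical; in the study, coincidence of
# the author's workplace was used as the criterion for merging variants.
CANONICAL = {
    ("Gordon Guyatt", "McMaster University"): "Gordon H Guyatt",
    ("Gordon H Guyatt", "McMaster University"): "Gordon H Guyatt",
}

def normalize(name: str, institution: str) -> str:
    """Return the canonical form of an author name; fall back to the
    name as recorded when no known variant matches."""
    return CANONICAL.get((name, institution), name)
```

The workplace key is what prevents two different people who share a bibliographic name from being merged, which is the main risk the paper discusses.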

Data analysis
In this paper, we use the term 'co-authorship' to refer to joint authorship of a scientific paper by at least two individuals, and the term 'institutional collaboration' to refer to joint authorship by different institutions. 'Intensity of collaboration or threshold' refers to the number used to form clusters of authors and institutions (that is, the frequency of co-authorship between pairs of authors or of collaboration between institutions), and reflects a criterion for labeling identifiable clusters as research groups. Collaboration between authors (or institutions) was characterized by calculating the number of papers, names, signatures and collaborations, the index of signatures per paper or collaboration index (the mean number of signatures per paper), and the index of authors per paper (the mean number of distinct authors per paper). A summary box with definitions of each of the measurements of collaboration is provided in the supplementary material (see Additional file 1: 'Definitions of collaborative measurements').
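Under these definitions, the collaboration indices reduce to simple ratios. A minimal sketch (the paper list is invented purely for illustration):

```python
# Each paper is represented by its list of author signatures.
# Invented toy data: three papers, one single-authored.
papers = [
    ["A", "B", "C"],   # 3 signatures
    ["A", "B"],        # 2 signatures
    ["D"],             # single-authored
]

signatures = sum(len(p) for p in papers)                 # total signatures
distinct_authors = len({a for p in papers for a in p})   # distinct names

# Collaboration index: mean number of signatures per paper.
collaboration_index = signatures / len(papers)
# Index of authors per paper: distinct authors divided by papers.
authors_per_paper = distinct_authors / len(papers)
```

With this toy sample, the collaboration index is 6/3 = 2.0 signatures per paper, while the index of authors per paper is 4/3, since authors A and B each sign twice but are counted once.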
To construct co-authorship networks, we identified all combinations of pairs of authors for each paper. The number of co-authorships for each paper is related to the number of authors, and is given by the binomial coefficient C(m, n) = m! / (n!(m - n)!), where m is the number of individual authors and n is the number of elements in the groups constructed (here, n = 2, giving m(m - 1)/2 pairs per paper). Once co-authorship was quantified, we further established an a posteriori threshold of two or more collaborations between pairs of authors, in order to reduce the number of nodes and links that would otherwise prevent a clear view of the network, and thus center the analysis on the more intense co-authorship relationships. The same approach was applied to institutional and country authorship to construct the network of collaborations, although in this case we applied an a posteriori threshold of three or more papers signed in co-authorship. The productivity and patterns of collaboration by author, institution and country were analyzed. We used PAJEK [17], a software package for large network analysis that is free for non-commercial use, to analyze indicators and construct social networks.
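The pair-generation and thresholding steps can be sketched in a few lines of Python (author lists are invented; the study itself used PAJEK for the actual network analysis):

```python
from collections import Counter
from itertools import combinations

# Invented toy data: each paper is its list of authors.
papers = [
    ["A", "B", "C"],
    ["A", "B"],
    ["B", "C", "D"],
]

# Count co-authorship ties: a paper with m authors contributes
# C(m, 2) = m(m - 1)/2 unordered pairs.
ties = Counter()
for authors in papers:
    for pair in combinations(sorted(set(authors)), 2):
        ties[pair] += 1

# Keep only edges meeting the a posteriori threshold
# (two or more papers signed in co-authorship).
THRESHOLD = 2
edges = {pair: n for pair, n in ties.items() if n >= THRESHOLD}
```

Here only the A-B and B-C ties survive the threshold; the surviving weighted edge list is what a package such as PAJEK would then lay out and cluster.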

Results

Number of meta-analyses
The PubMed search generated 804 records. Following screening of abstracts and full-text articles, 724 publications were retained, and 12 additional publications were added from complementary searches of reference lists, yielding a final sample of 736 included meta-analyses. The process of study selection is presented in Figure 1, and the characteristics of the included studies are summarized in Table 1. Approximately three-quarters of the meta-analyses were reported during the most recent decade.
The included meta-analyses had a median of 5 authors, although 41 (6%) were single authored (Table 1). More than a quarter (214 [29%]) of the first authors were from the USA, with three countries (the USA, the UK and Canada) accounting for more than two-thirds of the meta-analyses published during the period of analysis (Table 1).

Production and collaboration patterns
Overall, 3,178 authors, 891 institutions and 51 countries worldwide were involved in the sample of articles. We identified 64 authors who published 5 or more papers (Table 2). The most prolific authors were Lau (15 papers), Guyatt (14), Peto (13), Yusuf (12), Cook (11) and Jüni (11). Many of the most prolific authors are affiliated with only a few academic institutions and/or medical centers: six are affiliated with the University of Oxford (Peto, Collins, Clarke, Baigent, Gray and Rothwell), five with McMaster University (Guyatt, Yusuf, Cook, Douketis, and Eikelboom) and five with the University of Bern and/or the Inselspital - Bern University Hospital (Jüni, Trelle, Egger, Reichenbach and Nüesch). Applying a threshold of 2 or more papers published as co-authors (Figures 2, 3, 4, 5, 6, 7), we identified 82 clusters of authors. Of these, 12 were identified as major co-authorship groups of 10 or more members (1 with 55 members, 1 with 54, 1 with 27, 1 with 15, 4 with 14, 3 with 11 and 1 with 10); smaller clusters included 1 with 8 members, 5 with 7, 5 with 6 and 3 with 5.
Institutional productivity was headed by McMaster University (49 papers), the University of Oxford (48 papers) and Harvard University (36 papers) (Table 3), followed by some of their affiliated hospitals or medical centers (Brigham and Women's Hospital and the Radcliffe Infirmary, with 32 and 30 papers, respectively). Applying a collaboration threshold of at least 3 papers signed with inter-institutional collaboration, we identified 19 clusters comprising a total of 120 institutions (Figures 8 and 9). Of these, the largest institutional cluster comprised 76 members.
The productivity ranking for countries with respect to the number of papers (Table 4) was headed by the USA (310 papers), the UK (297 papers) and Canada (143 papers). After these countries came Australia (70 papers), and Italy and the Netherlands (57 papers each). The USA and the UK also headed the list of the number of different countries with which they had collaborated, as well as the total number of collaborations. Figure 10 shows a visual representation of the collaborative network between countries, in which we can see the relationships of some with respect to others and the position that each occupies in the network as a whole.
The 75 most cited articles, ranked by number of citations, are listed in the supplementary material (see Additional file 2: 'List of most cited meta-analyses'). Heavily cited meta-analyses include those examining the health effects of pharmacological interventions in cardiology and oncology (for example, antithrombotic, anti-platelet, antihypertensive and lipid-lowering trials, and chemotherapy for diverse cancers such as breast cancer, lung cancer, or head and neck cancers). Citation information refers to the total citations received by all the papers published in a given journal (for example, the 289 papers published in the BMJ received a total of 37,930 citations). Country information was incomplete for some records: we used the country of the first author in 704 of the 736 papers, while for the remaining 32 papers this information was retrieved from the correspondence address.

Discussion
Generally, the visibility and recognition of scientific research activities benefits from increasing collaborative research [18]. Research collaboration plays an important role in science, policy and medicine [19,20]. Research collaboration in the 'Big Science' era involves addressing important and relevant research questions that require a complex construction of multi-disciplinary teams of scientists and researchers, large-scale scientific structures, budgets of an unprecedented scale, and widespread sharing of scientific knowledge and data. Thus, meta-analysis can be considered a good example of 'Big Science' in medicine and clinical epidemiology [21], as quantitative evidence synthesis is the application, in practice, of the principle that science is cumulative [22,23]. An obvious manifestation of this is the observed trend of 75 new randomized trials and 11 new systematic reviews being published daily, with a plateau in this growth not yet reached [24].
Therefore, promoting research collaboration in evidence synthesis can strengthen research activity, productivity and impact. In general, we found a strong clustering of papers published in two British journals (the BMJ and The Lancet accounted for 57% of meta-analyses), in contrast to other general medical journals (for example, the NEJM represented less than 2%). We hypothesized that these differences between journals may reflect editorial policy and/or preference, with the BMJ, The Lancet and JAMA specifically being more interested in and/or promoting the publication of high-quality quantitative evidence synthesis.
Perhaps a relevant finding is that collaborative networks are expanding in multiple regions, revealing a discernible and well-established scientific community, with the most prolific authors and institutions having an important number of collaborations. As might be expected, the scientific community captured by the networks is centered on a nucleus of scientists and researchers from academia, medical centers and health research institutes in western high-income countries (North America, Western Europe and Australia/Oceania). Specifically, the most intense global collaborations took place between authors and institutions from the USA, the UK and Canada. However, although these three countries lead in the number of published high impact meta-analyses, the efforts during the period of study were global, with publications from authors and institutions in more than 50 different countries. Cultural links may have historically benefited some countries through alliances with nations and regions that speak the same language (as may be the case for the UK through alliances with Commonwealth countries that speak English) and have adopted similar scientific and research structures [25]. However, there is a clear over-representation of scientists based in western high-income countries, and the limited participation of researchers based in low- and middle-income countries could warrant further pragmatic action. Given that research resources and funding are often restricted, it is the responsibility of the scientific community to use the available resources most efficiently when exploring research priorities to address the health needs of the population, stimulating north-south and west-east collaborations where possible. In fact, these results are consistent with those reported by Uthman et al.

Figure 5 Co-authorship networks. Main clusters of authors (≤ 14 members), applying a threshold of two or more papers signed in co-authorship.
[26], who assessed the characteristics of the 100 most frequently cited meta-analysis-related articles. Although the scope of our research differs from that paper, those authors also showed that the USA, the UK and Canada have taken the lead in producing highly cited papers, whereas no first author from a low- or middle-income country led any of the most cited papers.
The maps of scientific partnership show that authors who are 'leaders', and who may thus drive collaboration, collaborate more frequently and intensely with other authors and institutions from different countries. The study also identifies highly cohesive cluster networks and provides considerable information on their structure that can be put to various purposes, such as funding agencies designing strategies for future scientific collaboration, agencies such as the World Health Organization promoting a globally coordinated agenda for perceived high-priority clinical topics, and sharing reliable and innovative methodologies that can be linked to world-class educational and training opportunities.
There are several possible explanations for our findings. The use of modern communication and information technologies, especially the Internet, has diminished the role of geographical and territorial boundaries in the access to and transmission of information [27]. This has enabled scientists, and particularly systematic reviewers, to pursue a closer internationalization of research and collaboration. Similarly, the creation of some international collaborations, including those conducting clinical trials, may have laid the groundwork for the subsequent realization of collaborative meta-analyses with a clear scientific and clinical impact. For example, according to the SCI-E, the most cited meta-analysis article has received more than 2,500 citations [29-31].

Figure 7 Co-authorship networks. Main clusters of authors (≤ 6 members), applying a threshold of two or more papers signed in co-authorship.
Collaborative networks, an important form of social network analysis, have been intensively studied in many scientific disciplines, including biology, physics, medicine and economics [13,16,32-36]. To our knowledge, no study has previously described and characterized the global collaborative networks in this field.

There are several limitations to our study. First, although the scientific production analysed was drawn from an exhaustive analysis of the literature, it is possible that the search missed some relevant articles. Furthermore, some reports were published in journals without being indexed as meta-analyses, making them difficult to identify. The analyses inevitably represent an initial investigation, and a more detailed exploration is also needed. In addition, we restricted our analysis to meta-analyses that considered randomized trials as the primary source of clinical evidence, and therefore some scientists, researchers or institutions may not appear because their papers are not reflected in the collaborative networks (for example, genetic epidemiology). It would be interesting to explore whether the use of alternative sources (such as observational research, descriptive epidemiology, molecular genetics, or non-clinical studies) would yield results similar to those reported here. Our analysis was also limited in scope, focusing only on original research and review articles. Undoubtedly, there are other important reports (for example, methodological [37-39] and conceptual papers [6,40,41]) that also merit consideration. Second, we excluded the Cochrane Library, specifically the Cochrane Database of Systematic Reviews, a major source of systematic reviews. However, to date its impact factor is smaller than that of any of the included journals.
Given the dynamic nature of the field, other opportunities for further research include examining the evolution of the identified networks over time (for example, by means of longitudinal analysis), also considering papers published in multi-disciplinary journals or in journals belonging to other categories. Additionally, we made no further inquiries or attempts to verify the quality of reporting of the meta-analyses in our sample. Previous research [42] has addressed this issue, pointing out that some of the meta-analyses published in leading medical journals have important methodological limitations. Third, as in many bibliometric analyses, normalizing the names of scientists, researchers and their institutions is fundamental to avoiding errors caused by unrecognized variations in the name of a single author. Nevertheless, we conducted a careful manual validation of the bibliographic references to avoid these potential errors. In the case of authors, when two or more variants of a name or surname occurred, the criterion was to check the coincidence of the different variants with the workplace. As discussed elsewhere [34], this procedure does not assure complete certainty: it does not take into account possible changes in an author's workplace, nor does it avoid the problem of the same bibliographic name referring to the scientific production of two or more authors, although the fact that a single field and a short chronological period were analyzed helped to minimize this kind of error. For institutional names, the main problem is that the same name frequently applies to two or more institutions, something that is common for authors who work in institutes or hospitals connected to universities. In such cases, we opted to assign as many names to the macroinstitutions as could be identified.
Although this resulted in the problem of multiplying the number of institutions in the recount, it was necessary in order to avoid losing information concerning the macroinstitutions occurring in second place or later in the list of names. The same criterion of multiplying the names was used in the case of the institutes and other research organizations, sometimes administratively dependent on one macroinstitution, the result being that a 'fictitious' inter-institutional collaboration may have been obtained.

Conclusions
Our study identified the most significant collaborative networks of authors, institutions and countries publishing meta-analyses of randomized trials in high impact medical journals. This information may be used to strengthen scientific capacity for global collaboration and help to build a cooperative scientific agenda for future research of excellence in the field of clinical evidence synthesis in a manner similar to how some international clinical trial collaborations have developed. We hope that this analysis will be useful as policy makers, researchers and institutions look to the future.