
Mapping Australia's basic research in the medical and health sciences

Paul F Bourke and Linda Butler
Med J Aust 1997; 167 (11): 610-613.
Published online: 8 December 1997


The Institute for Scientific Information indexes most of the major international basic research journals in science in the Science Citation Index (SCI). Australia's presence in the SCI's medical and health sciences journals, and the citations its published research receives in those journals, show that Australia's basic medical research has high international "visibility". Mapping the source of the most highly "visible" Australian medical research articles shows high-impact research coming from several different sectors (research institutes, universities, hospitals, etc.), but with a concentration in the member institutions of the Australian Association of Medical Research Institutes (AAMRI). Published research from AAMRI members is cited at a rate two-thirds higher than the Australian average for the medical and health sciences. (MJA 1997; 167: 610-613)

 


© MJA 1997

 

Introduction

A very high proportion of the articles reporting the results of basic research in the medical and health sciences are in the journals indexed by the Institute for Scientific Information (ISI) in the Science Citation Index (SCI). The SCI is therefore an excellent tool for identifying the sources of Australian published research in this field, while the citations received by (or references to) these articles can also be used to identify the source of the most-cited research. We have focused on Australia's basic research output, not its applied research.  

Data source

The Research Evaluation and Policy Project (REPP) at the Australian National University (ANU) has constructed a database of all Australian research published in ISI-indexed journals for the period 1981-1995. This database has been well documented in several of our published studies, and a description is available on the Internet.1,2 Our analysis of this database is based on the addresses shown on the publications, which we have "cleaned" down to the level of university department. This has been done by ensuring all variants of a departmental, faculty or institutional address (sometimes running into hundreds) are grouped together and given the same "standard" address. The database also contains details of the citations in ISI journals received by these Australian research articles; that is, the number of times Australian articles are in the reference list of other ISI journal articles.
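The "cleaning" step described above amounts to mapping the many variants of a departmental address to a single standard form. The sketch below illustrates the idea; the variant strings and the mapping are hypothetical examples, not the actual REPP lookup tables.

```python
# Hypothetical variant-to-standard mapping for one department. In
# practice a single department can have hundreds of address variants.
STANDARD_ADDRESS = {
    "dept of medicine, univ of sydney": "University of Sydney, Department of Medicine",
    "department of medicine, sydney university": "University of Sydney, Department of Medicine",
    "univ sydney dept med": "University of Sydney, Department of Medicine",
}

def clean_address(raw):
    """Group all variants of a departmental address under one standard
    form; unrecognised addresses are returned unchanged for manual review."""
    key = " ".join(raw.lower().split())  # normalise case and whitespace
    return STANDARD_ADDRESS.get(key, raw)
```

Once every record carries a standard address, publication and citation counts can be aggregated reliably at the department, institution or sector level.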

We also code the sector -- universities, hospitals, medical research institutes, government institutions, and "other" -- of each address. The medical research institutes sector comprises the 25 members of the Australian Association of Medical Research Institutes (AAMRI). Medical research in the Government sector comes primarily from the Commonwealth Scientific and Industrial Research Organisation (CSIRO) and the State and Federal departments of health. The "other" sector includes industry and non-profit organisations.

We acknowledge "grey areas" at the margins of these assignments to sectors. For example, where a research group based in a hospital with a university connection lists the university in the address, we consistently assign the publication to the university. Another example is the John Curtin School of Medical Research (JCSMR), a full-time medical research institute (though not a member of AAMRI), which is also a research school of the ANU funded from that university's operating grant. Its publications are assigned to the ANU and hence fall within the universities sector.

Most publications can be unambiguously assigned, but some addresses are not precise guides to where the research was conducted. This arises particularly for researchers who hold conjoint or adjunct appointments in two institutions. If such an author nominates a single institution, we accept that this is the location of the research leading to that publication. If an author specifies two separate addresses, two records are created for that publication, one for each institution (double-counting created by this procedure is removed for sectoral and national analysis, but remains when the individual institution is the focus of the analysis). If the address itself has multiple components, the publication cannot be split into multiple records and has to be assigned to the apparent primary institution. Very few publications fall into this latter category, and their existence has little effect on the map of national, sectoral or institutional sites of research.
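The double-counting rule described here can be sketched as follows; the record layout is a hypothetical illustration, not the actual REPP database schema.

```python
# Hypothetical address records: one record per (publication, institution)
# pair, as produced by the procedure described above. Publication 1 has
# authors at two institutions, so it appears twice.
records = [
    {"pub_id": 1, "institution": "ANU", "sector": "universities"},
    {"pub_id": 1, "institution": "WEHI", "sector": "medical research institutes"},
    {"pub_id": 2, "institution": "ANU", "sector": "universities"},
]

# When individual institutions are the focus, every record counts.
institution_counts = {}
for r in records:
    institution_counts[r["institution"]] = institution_counts.get(r["institution"], 0) + 1

# For national totals, duplicate records for the same publication
# collapse to a single publication.
national_total = len({r["pub_id"] for r in records})

# institution_counts -> {'ANU': 2, 'WEHI': 1}; national_total -> 2
```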

We have used the standard bibliometric practice of ascribing an article to a particular field of research on the basis of the classification of the journal in which it appears. This procedure is not without problems, particularly in the case of analysis at the subfield level, but experience from our other studies has shown that the results are accurate if the data are being used to map research in a large field.3  

To what extent is Australian basic research in the medical and health sciences covered by the REPP database?

In this field, at least 70% of published research output from universities and medical research institutes is in the form of journal articles, and, of these, at least 70% appear in SCI journals.4,5 We therefore estimate that, as a minimum, the REPP database covers 50% of the published output in these two sectors. SCI coverage of research from the hospitals and government sectors may be less complete in terms of their total research output, but their contributions to basic research will be well represented.

In this article, we are using the REPP database to answer questions about the map of medical research in Australia, concentrating primarily on publications from 1990 onwards, but also introducing some time-series analysis.  

Where is the research being conducted?

The sectoral distribution (Figure 1) shows that the bulk of Australia's basic medical research is located in universities and hospitals. Most of the research articles from the universities sector (75%) come from the 10 teaching medical schools and the JCSMR at the ANU.


Our data also show that little has changed in the sectoral location of research over the past 15 years: the share of publications from the hospitals sector has remained constant, there has been a small drop in the universities share, and there has been a corresponding increase in the share from the medical research institutes.  

Are patterns of medical authorship changing?

The REPP database enables us to distinguish publications by type of authorship:
- Single author -- one author only (i.e., no collaboration);
- Group -- more than one author, all sharing the same departmental address;
- Institutional -- more than one author, from different departments or faculties of the same institution;
- National -- more than one author, from different institutions in Australia; and
- International -- more than one country listed in the author addresses.
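These five authorship types can be expressed as a simple classification that checks for the most "distant" collaboration first. The record format (one address dict per author) is a hypothetical illustration, not the actual REPP schema.

```python
def classify_authorship(addresses):
    """Classify a publication by the most 'distant' collaboration
    present, checking from international down to single author."""
    countries = {a["country"] for a in addresses}
    if len(countries) > 1:
        return "international"
    institutions = {a["institution"] for a in addresses}
    if len(institutions) > 1:
        return "national"
    departments = {a["department"] for a in addresses}
    if len(departments) > 1:
        return "institutional"
    if len(addresses) > 1:
        return "group"
    return "single author"
```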


Figure 2 shows how the type of authorship of medical research articles has changed in the past 15 years. The two authorship types showing marked decline are "single author" articles, and "group" articles. Medical and health sciences in Australia appeared slow to exhibit the "internationalisation" of research apparent in other fields since the early 1980s, but the period since 1987 has seen a dramatic change. The proportion of publications involving international collaboration nearly doubled between 1987 and 1995. Collaboration with other Australian institutions has also become more common, increasing from 19% to 26% over the 15-year period. (A detailed analysis of Australia's international collaboration in basic research may be found in a monograph we prepared for the Australian Research Council.6)

 

How "visible" is Australian medical research?

We assess "visibility" by the number of research articles published and the citations those articles receive. Figure 3 plots Australia's share of "world" publications (i.e., of the total in the SCI) in the medical and health sciences and its share of all citations in SCI journals. The chart also plots Australia's Relative Citation Impact (RCI), which is calculated by dividing its share of "world" citations by its share of "world" publications.
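The RCI calculation just described can be sketched as follows; the numbers in the usage note are illustrative, not the article's data.

```python
def relative_citation_impact(aus_pubs, world_pubs, aus_cites, world_cites):
    """RCI = (Australia's share of world citations) /
             (Australia's share of world publications).
    A value of 1 means Australian articles are cited at the world
    average rate."""
    publication_share = aus_pubs / world_pubs
    citation_share = aus_cites / world_cites
    return citation_share / publication_share
```

For example, a country producing 2% of world publications and attracting 2% of world citations has an RCI of 1; attracting 3% of citations would lift its RCI to 1.5.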


The most notable feature of Figure 3 is that Australia's share of publications in SCI medical journals increased by 25% between 1986 and 1995. The average RCI for the whole period was 1, indicating that Australian publications are cited at about the world average rate. Australia's RCI has not changed significantly over time, but has remained at or marginally above 1. This is a strong performance, as citation rates are influenced principally by publications from the major research centres of America and Europe.  

Where is Australia's most "visible" medical research being conducted?

A standard measure used to compare the visibility of research in different sectors is the average number of citations received per publication (cpp). In Box 1 (below) we relate the number of publications produced by each sector in the period 1991-1995 to the number of citations those publications attracted in the same period. The leading position of the institutes making up the AAMRI is consistent with Richard Smith's impression of Australia's research institutes.7


Some of the differences in cpp rates can be attributed to the varied research profiles of different institutions. Data supplied by ISI enable us to quantify this, as we can calculate cpp rates for sets of journals. For example, articles in immunology journals for the same period attracted citations at the average rate of 6.21, while those in clinical sciences journals averaged 3.81. We would therefore expect hospitals, with their strong clinical focus, to have a lower cpp rate than those AAMRI establishments with a strong presence in immunology. However, the difference in cpp rates apparent in Box 1 (above) cannot be explained fully by differences in field concentrations; we calculate that differing citation rates across fields account for only a third of the gap in cpp rates between the AAMRI institutions and other sectors. The remainder is a measure of the differing visibility and impact of the research.
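The field-normalisation step can be sketched as follows. The two field citation rates are those quoted above; the publication counts in the test case are invented for illustration only.

```python
# World average citations per publication (cpp) for two fields, as
# quoted in the text.
FIELD_CPP = {"immunology": 6.21, "clinical sciences": 3.81}

def expected_cpp(pubs_by_field):
    """The cpp a sector would show if each of its articles were cited
    at its field's world average rate. Comparing this with the sector's
    actual cpp separates the effect of field mix from genuine
    visibility."""
    total_pubs = sum(pubs_by_field.values())
    weighted = sum(FIELD_CPP[field] * n for field, n in pubs_by_field.items())
    return weighted / total_pubs
```

A sector's actual cpp minus its expected cpp is the part of its visibility not explained by field concentration.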

Not all institutions within a given sector have similar visibility. In Box 2 we list the top five institutions (in terms of cpp rates) in each of the four sectors active in medical research. In the universities sector, we looked specifically at the teaching medical schools and for this reason have excluded JCSMR. The institutions in Box 2 are those with more than 100 SCI publications in the period. In this instance, we count publications in multidisciplinary journals such as Nature, Science and Proceedings of the National Academy of Sciences together with publications in medical and health sciences journals, on the assumption that for the listed institutions and faculties these articles would almost certainly relate to medical research.

Sorting the institutions in Box 2 by cpp rates within sectors allows the volume of publications to be moderated by impact. The consequences of the choice of ranking measure can be seen in a closer examination of the medical research institutes. While the Walter and Eliza Hall Institute of Medical Research has the largest number of publications and citations, and would be ranked first if either of these were the measure used, the Ludwig Institute for Cancer Research is top-ranked on the basis of average cpp rates. All three are measures of impact and visibility, but cpp rates take institutional size into account.

We looked at the research focus of institutions to determine if the higher cpp rates of some institutions resulted from differing fields of concentration. Again, we found that while an institution's cpp rate was affected by the relative impact of the fields in which it was active, this accounted at most for only 30% of the variation in cpp rates between institutions in any given sector.  

Where are Australia's most highly cited medical articles produced?

We identified a very small group of 12 articles, published since 1990, which have attracted more than 200 citations (Box 3). This Box does much to explain the ranking of the AAMRI institutes in Box 2. The Ludwig Institute for Cancer Research had only 135 publications satisfying our criteria of publication date and journal, yet three of these have attracted more than 200 citations. St Vincent's Institute of Medical Research had even fewer publications (130), but claimed authorship of the most highly cited publication for the period, with 551 citations, and another article with 256 citations. Seven of the listed articles, including the most highly cited publication, were "wholly" Australian; the other five articles involved international collaboration.  

Discussion

Studying ISI journals in isolation does not permit conclusions to be drawn about quality. However, an analysis of citations can provide a guide to the source of Australia's most visible research, and there is a well established positive association8 between high visibility in ISI-indexed journals and research judged on other grounds, such as via peer evaluation and esteem measures, to be of high quality.

The limitations of bibliometric analysis are well documented.9 Publications can attract large numbers of citations because they contain error, or because they report a new technique with wide application. Citation data are also highly skewed. Many publications attract no citations at all, and most of those that do receive only one or two. These and other problems are of little consequence when the focus is at the national or sectoral level, involving large numbers of publications.10

Our data provide only one approach to constructing a profile of Australian medical research. However, bibliometric or literature-based analysis cannot stand in isolation from historical and other kinds of evaluative judgements, and should not be used in a policy setting apart from those perspectives. That said, we believe that the information reviewed here does allow some interesting points to be made.

The most encouraging inference we draw from our study is that, using the measure of Relative Citation Impact, Australian medical research stands relatively high in terms of international visibility. As Box 3 makes clear, Australian-based researchers publish in international journals and have well established links to collaborative projects in the major centres of work in the field.6

The bulk of Australia's basic research in the medical and health sciences comes from the universities and hospitals, but Australia's medical research institutes, the members of AAMRI, have the highest international profiles. Research from these institutions has had a major impact on the international community. However, high visibility is not confined to the AAMRI institutions, and, as we have shown, research achieving very high impact also comes from hospitals and universities.

One of the most interesting policy issues which these data raise is the efficacy of block funding by comparison with direct project funding of research. Medical research in Australia is undertaken in a pluralist system, with major contributions from hospitals, universities, AAMRI and government institutions. While at first glance our data may appear to argue for block funding through the prominence of several AAMRI institutions funded in this way, the situation is more complex. Many of the AAMRI institutions are block-funded, but in some instances this accounts for as little as 35% of their total income. Any conclusions about funding await the completion of detailed bibliometric studies of the relative performance of medical research, in which we will attempt to identify the output of research supported by the varying modes of research funding.  

References

  1. Bourke P, Butler L. A Crisis for Australian science? Canberra: Performance Indicators Project, Australian National University, 1993. (Monograph Series No. 1.)
  2. <http://coombs.anu.edu.au/Depts/RSSS/REPP/repp.htm>
  3. Butler L, Bourke P, Biglia B. CSIRO: profile of basic research. Canberra: Research Evaluation and Policy Project, Australian National University, 1997. (Monograph Series No. 4.)
  4. National Board of Employment Education and Training (NBEET). Quantitative indicators of Australian academic research. Canberra: AGPS, 1994. (Commissioned Report No. 27.)
  5. Bourke P, Butler L. Monitoring research in the periphery. Canberra: Research Evaluation and Policy Project, Australian National University, 1996. (Monograph Series No. 3.)
  6. National Board of Employment Education and Training. International links in higher education research. Canberra: AGPS, 1995. (Commissioned Report No. 37.)
  7. Smith R. Top of the pile: the institutes. BMJ 1991; 302: 1006-1010.
  8. Narin F. Evaluative bibliometrics. Cherry Hill, NJ: Computer Horizons Inc, 1976.
  9. Van Raan AFJ, editor. Handbook of quantitative studies of science and technology. Amsterdam: Elsevier Science Publishers, 1988.
  10. Garfield E. In: Evered D, Harnett S, editors. Ciba Foundation Conference: the evaluation of scientific research. Chichester (UK): John Wiley & Sons, 1989.

(Received 6 Jun, accepted 29 Sep, 1997)  


Authors' details

Research Evaluation and Policy Project, Research School of Social Sciences, Australian National University, Canberra, ACT.
Paul F Bourke, PhD, FASSA, Head of Research Evaluation and Policy Project; and Professor of History.
Linda Butler, BEcon, Research Officer.
Reprints: Professor P F Bourke, Research Evaluation and Policy Project, Research School of Social Sciences, Australian National University, ACT 0200.
E-mail: paulb@coombs.anu.edu.au




