
Medical research in New South Wales 1993-1996 assessed by Medline publication capture

Emmanuel J Favaloro
Med J Aust 1998; 169 (11): 617-622.
Published online: 14 December 1998

Medical Research Perspectives



Abstract

Objectives: To assess medical research publication output in New South Wales (NSW).
Design: Analysis of publication information from the Medline indexing database, 1993-1996 inclusive.
Setting: Teaching hospitals and affiliated universities and medical research institutes within NSW, the major sites for NSW medical science publications.
Major outcome measures: Cumulative number and location of Medline-identified publications; journal citation indices (impact factor and immediacy index).
Results: A total of 8860 published articles were captured for the analysis period. Universities and hospitals accounted for most of the publications (n = 7755). A mean of 73.1% (range, 36%-100%) of all articles were published in overseas journals, and the rest in Australian journals. This average trend applied to most universities and teaching hospitals, whereas research institutes published almost exclusively in overseas journals. Average publication impact factor values for most universities and teaching hospitals were around the average value for all NSW publications (2.203). The range for teaching hospital publications was 1.000-2.823, whereas that for the overseas-publishing medical research institutes tended to be higher (2.480-5.423). Immediacy index data yielded similar findings.
Conclusions: The universities and teaching hospitals account for most of the medical publications arising from NSW, and also those appearing in Australian journals. Thus, these sites provide the bulk of Australian medical practice end-user information. In contrast, the medical institutes concentrate on publishing in overseas journals with higher and quicker citation rates (higher impact factor and immediacy index).


Introduction

The Medline database can be used to capture research publications arising from within one institution, or the search can be expanded to include, for example, all Medline-held publications from one State. Further, to help analyse publication activity, publication data derived from Medline searches can be merged with markers of publication citation, such as "impact factor" and "immediacy index". This latter information comes from the Institute for Scientific Information (ISI; Philadelphia, USA), which catalogues most of the major research journals in Science Citation Index (SCI) publications,1 and publishes "impact factor" and "immediacy index" data. Data on impact factors are often also used as surrogate markers of publication "quality", although there is much debate about the validity of this approach.2-5

Recently, Bourke and Butler analysed individual article citation rates as a measure of basic medical and health sciences research in Australia, concentrating largely on universities and high-profile research institutes.6 Although these authors have produced several other reports analysing Australian research activities,7 none has looked specifically at teaching hospitals, "affiliated" research institutes (defined as the "research arm" of the hospital) and associated universities within New South Wales (NSW).

This article reports the results of such a sampling analysis.


Methods

A complete description of the methods used is beyond the scope of this report. In brief, the Medline publication indexing database (SilverPlatter version) was used to capture "quantitative" research publication information. This database indexes most major journals in medical and related fields (veterinary, pathology and cell science), and each indexed publication includes the primary author's address.

A medical research institute was defined as hospital-affiliated if more than a third of its publications included the teaching hospital as part of the Medline address field; that is, the institute and its researchers clearly recognised the affiliation within the context of the publication. Adoption of this strict criterion meant that some research institutes were excluded, or were not identified as hospital-affiliated sites, despite being "popularly perceived" as affiliated with a particular teaching hospital. The teaching hospitals, their affiliated universities and medical research institutes included in this report are listed in Box 1.
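As a minimal sketch, the affiliation criterion amounts to a simple proportion test over an institute's captured records; the Python below uses hypothetical record and field names (the database structure actually used is not described in this report).

```python
# Minimal sketch of the affiliation criterion: an institute is treated as
# hospital-affiliated if more than a third of its captured publications
# name the teaching hospital in the Medline address field.
# The "address" field name is an illustrative assumption, not the actual schema.

def is_hospital_affiliated(records, hospital_name, threshold=1/3):
    if not records:
        return False
    hits = sum(1 for r in records if hospital_name.lower() in r["address"].lower())
    return hits / len(records) > threshold

# Example: 14 of 40 records name the hospital -> 0.35 > 1/3 -> affiliated.
records = ([{"address": "Institute X, Westmead Hospital, Westmead, NSW"}] * 14
           + [{"address": "Institute X, Westmead, NSW"}] * 26)
print(is_hospital_affiliated(records, "Westmead Hospital"))  # True
```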

Data capture: The approach found to capture publication data best was to use (i) general ("primary") locality markers (ie, the terms "New South Wales", "NSW", "Sydney"), and (ii) separate specific ("secondary") locality markers (ie, names of the suburbs in which the medical research organisations were located). This led to more complete capture of relevant publications than use of specific organisational names, because organisational names were inconsistent across publications (eg, University of Sydney v. Sydney University). Captured data were then merged with journal citation data.
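The SilverPlatter query syntax itself is not reproduced in this report; the sketch below simply illustrates the two-tier matching logic, with a few example suburb names standing in for the full secondary marker list.

```python
# Sketch of the two-tier locality capture: general ("primary") markers plus
# suburb-level ("secondary") markers, matched case-insensitively against the
# address field. The suburb list here is a small illustrative sample only.

PRIMARY_MARKERS = ("new south wales", "nsw", "sydney")
SECONDARY_MARKERS = ("westmead", "camperdown", "darlinghurst")  # examples only

def captured(address: str) -> bool:
    addr = address.lower()
    return any(marker in addr for marker in PRIMARY_MARKERS + SECONDARY_MARKERS)

# Locality matching sidesteps inconsistent organisation naming:
print(captured("Dept of Medicine, University of Sydney, NSW 2006"))  # True
print(captured("Sydney University, Camperdown"))                     # True
```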

Years of analysis: Medline capture and analyses were conducted for the years 1993 to 1996 (inclusive), as these were the most recent four consecutive years for which SCI-published impact factor and immediacy index information was available.

Subsequent analysis: Applicable publications were downloaded to a personal computer, and a composite database was constructed specifically for analysis of derived information.

Author's address: The author's address, and thus the research institution location, was defined essentially as given in the downloaded Medline publication record, with the aim of being as objective as possible. If a publication noted Sydney University as the author's address, then Sydney University was the designated publication address. Where multiple authors' addresses were noted, the address was "shared" (Box 2).
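The exact sharing rule is given in Box 2 (not reproduced here); as a sketch, and consistent with the Discussion (where publications are assigned to both the university and the hospital), the tallying below credits every recognised site named in the address field. Site names and addresses are illustrative.

```python
# Sketch of "shared" address assignment: each recognised research site named
# in the address field is credited with the publication.

from collections import Counter

SITES = ("Westmead Hospital", "University of Sydney")  # illustrative list

def tally(addresses):
    counts = Counter()
    for address in addresses:
        for site in SITES:
            if site.lower() in address.lower():
                counts[site] += 1  # a shared address credits each named site
    return counts

pubs = ["Dept of Haematology, Westmead Hospital and University of Sydney, NSW"]
print(tally(pubs))
# Counter({'Westmead Hospital': 1, 'University of Sydney': 1})
```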

Analysis of research output and outcome measures:

Research output can be assessed in various ways according to its underlying purpose. Several objective analyses were therefore undertaken to answer separate questions.


Results

Publication output

The composite NSW-derived Medline-captured database (1993-1996 inclusive) showed that "medical, veterinary, pathology and cell science" researchers (having given one of various localities within NSW as their address) published 8860 articles in this period.

A total of 7755 of the NSW-based publications captured by Medline arose from the universities and hospitals. Figure 1 shows data for teaching hospitals and research institutes with more than 35 publications, and Figure 2 shows data for universities affiliated with these research sites.

Figure 1: Medline-captured publications arising from NSW teaching hospitals and medical research institutes (1993-1996 total; only those sites with more than 35 publications over this period are shown). Data shown as a composite bar graph for those hospitals with an affiliated research institute (as defined in Methods). POWH=Prince of Wales Hospital.
Figure 2 (inset): Medline-captured publications arising from NSW universities affiliated with the research sites in Figure 1 (1993-1996 total).

Overseas v. Australian publication

More than 70% (73.1%; range, 36%-100%) of the research publications from NSW researchers appeared in overseas journals and the remainder in Australian journals, with a similar division evident across different hospital and university sites (Figure 3). In contrast, the medical research institutes tended to publish almost exclusively in overseas journals.

Figure 3: Medline-captured publications arising from NSW research. Percentage of journal articles in Australian journals v. overseas journals; data shown are 1993-1996 average. Data shown separately for the hospitals, universities and research institutes in Figures 1 and 2. POWH=Prince of Wales Hospital.

Journal citation data

Excluding publications not listed in the SCI1 publication statistics, the NSW-based average impact factor (all NSW Medline-captured journal publications) was 2.203 (1993-1996 averaged). Averaged impact factor and averaged immediacy index data for each research organisation are shown in Figure 4. For comparison, data for hospital, university and medical research institute sites are shown separately.

Figure 4: Average impact factor and immediacy index (1993-1996 average) for Medline-captured publications arising from NSW research. Data shown separately for the hospitals, universities and research institutes in Figures 1 and 2. (Average impact factor = cumulative impact factor of all journal publications for each research site divided by total number of journal articles captured for that site. Average immediacy index = cumulative immediacy index of all journal publications divided by total number of journal articles captured for that site.) POWH=Prince of Wales Hospital.
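The two averages defined in the caption are computed identically; in code, with hypothetical impact factor values:

```python
# The averages defined in the Figure 4 caption: cumulative value of the
# journal marker (impact factor or immediacy index) across all of a site's
# captured articles, divided by the number of articles captured for that
# site. The cumulative sum itself is the value plotted later in Figure 5.

def average_marker(values):
    cumulative = sum(values)  # Figure 5's cumulative impact factor
    return cumulative / len(values)

# Hypothetical site with three captured articles:
print(round(average_marker([1.8, 2.4, 2.5]), 3))  # 2.233
```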

Average impact factor values for most hospital sites (around 2) are similar to those generated from each of the affiliated universities. Indeed, the "all hospital average" impact factor was similar to the "all university average" impact factor. In general, the research institutes' tendency to publish in overseas-based journals increased their impact factor averages (Figure 4).

The pattern of immediacy index data closely followed the pattern observed for impact factor (Figure 4).


Discussion

General findings

This report confirms the importance of the NSW teaching hospital system in ongoing medical research and teaching activity. Together with their affiliated universities, these sites provided the great bulk of medical publications arising from NSW. While the research institutes tended to target "international" journals directed at specialised scientific research, teaching hospitals and their affiliated universities targeted Australian journals as well.

This means that the "non-research institute" sites play the predominant role in providing local educational support to Australian health practitioners via journals published and widely read within Australia. Although publishing in Australian journals often carries less international "prestige" or "visibility" than publishing in overseas journals, Australian publications play an important role in education of medical, scientific, nursing and allied health practitioners. Furthermore, Australian publications have particular relevance to Australian medical practice, with reporting of specific local data or locally relevant issues (eg, local epidemiological or local infectious disease data).

Advantages and disadvantages of the method

Data were collected by a well recognised and accepted method of capturing publication information. The Medline database has previously been used with success in bibliometric studies to show publication trends,8-10 and has been consistently shown to provide the strongest health discipline indexing coverage when compared with other databases.11-15

Based largely on internal comparative research estimates, the method used would be expected to capture in excess of 60% of the published research output from NSW medical researchers, and is thus only an approximation of the level and scope of all such activity. Missing would be:

  • Valid research publications not indexed by any indexing service (eg, reference book chapters, articles in popular science or society journals); and
  • References in non-Medline indexing services.
Moreover, the Medline indexing service may exclude certain journals preferentially favoured by some research organisations.

However, the method has the advantage of not capturing:

  • Non-standard publications (which would complicate any analysis); and
  • Low grade ("self-promotional") publications which may be included in subjective analyses (eg, publications in annual research reports, "in press" publications, conference presentations).
It also minimises the likelihood of "tally duplication" arising from research collaborations.
Some exclusions or errors in location assignment will have occurred despite extensive cross-checking to certify each publication's origin. I relied almost entirely on authors to provide their own affiliation information; however, this was not always provided, and the address details were not always consistent. Finally, Medline catalogues only the primary author's address, so publications from research collaborations will be excluded from an institution's tally when the principal author's address (i) falls outside the Medline capture search limits, or (ii) does not include the collaborating ("secondary") authors' addresses.

Importantly, while research output will necessarily be underestimated by this process, it would be expected that (apart from the limitations noted above) no individual hospital or other research organisation would be differentially disadvantaged in a direct-comparison process. The process employed within this report can easily be validated by any institution should there be any concern regarding objectivity or validity.

Journal citation analysis

The research institutes tended to target journals with higher and quicker citation rates (ie, average impact factor and immediacy index, respectively, were highest in their chosen journals), partly because these institutions published almost exclusively in overseas journals. Overseas journals tend to have higher citation rates, and thus higher impact factor and immediacy index values (as shown by a breakdown of average impact factor values for SCI-listed journals with publications from NSW in 1993-1996 found by Medline capture) (Box 4). Impact factor listings derive from the United States and most strongly favour US journals.3

Recalculating average research site impact factor data for overseas publications only (ie, excluding Australian publications) tends to increase the relative average impact factor value for most hospital sites by between 0.5 and 1.0, and thus brings their average values very close to those of the research institutes. The medical research institutes concentrate more specifically on publishing specialist research work targeted at other research scientists, which cultivates further research activities and citing of publications by research peers, maintaining these journals' high impact factors. The core hospital sites publish this sort of research as well as publishing "generalist education" or "local content" papers aimed at non-research ("end-user") health specialists. Both forms of publication are valid and valuable, although only the former has high "international visibility".
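As a sketch (with hypothetical record fields and values), the recalculation simply filters out Australian journal publications before averaging:

```python
# Average impact factor over overseas journal publications only.
# Record fields and values are illustrative assumptions.

def overseas_average_if(records):
    overseas = [r["impact_factor"] for r in records if r["country"] != "Australia"]
    return sum(overseas) / len(overseas) if overseas else None

records = [
    {"impact_factor": 2.8, "country": "USA"},
    {"impact_factor": 3.1, "country": "UK"},
    {"impact_factor": 0.9, "country": "Australia"},  # excluded from this average
]
print(round(overseas_average_if(records), 2))  # 2.95
```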

That immediacy index data largely follow the pattern of the impact factor data suggests that use of both citation markers may not be required in subsequent analyses, and that journals with high citation rates (impact factor values) also have quicker citation (immediacy index) values (confirmed by review of the Medline database and SCI publications1 -- data not shown).

Citation values as markers of research quality

Publication citation data are popularly used as a surrogate marker of publication "quality" on the premise that the higher the citation rate, the greater the scientific quality of the articles in that journal. The validity of this approach can be questioned and has given rise to much recent debate.2-5,16-18 The main problems relate to:

  • Accessibility of journals and journal listing bias, with only journals in the SCI database included in the analysis and listings strongly favouring English-language journals published in the United States;
  • Inferences that a journal's impact factor or citation pattern reflects each article's citation pattern (which is not the case; also article citation rates determine journal impact factor, not the other way around), and that citation reflects scientific quality (not necessarily -- an article may be cited often to exemplify a scientific flaw);
  • Citation bias, as there is no correction for the influence of self-citation in impact factor calculations (authors and journals may both favour self-citation); and
  • Specialty bias, as impact factors differ according to the research field.

There are many other potential traps.2-5 However, the impact factor process is relatively easy, and it offers an achievable comparison process, so its popular use as a quality marker persists. In practice, the impact factor may better reflect "international visibility" than "scientific quality". Furthermore, use of averaged data (eg, average impact factor and average immediacy index) perhaps shows "average visibility". A fairer comparative process might be to correlate total (or cumulative) impact factor as a marker of "total visibility", as shown in Figure 5.

Figure 5: Total (ie, cumulative) impact factor (1993-1996 period) for Medline-captured publications arising from NSW research. Data shown as a composite bar graph for those hospitals with an affiliated research institute (as defined in Methods). Cumulative impact factor derived by addition of individual impact factor values from all journal publications for each research site. POWH=Prince of Wales Hospital.

In a recent comparable analysis, Bourke and Butler6 assessed Australia's basic research in the medical and health sciences, using individual article citation rates to assess the "visibility" of research in different general research sectors (ie, universities, hospitals, medical research institutes, other institutions). They did this by calculating the average number of citations received per publication (cpp), and also listed specific institutions with high cpp values. Use of this more specific marker of article citation, rather than journal citation, overcomes some of the limitations, noted above, of using impact factor as a surrogate marker of publication quality. However, use of cpp is still based on the premise that a high citation rate reflects high scientific quality. With this tool, Bourke and Butler6 concluded that "the bulk of Australia's basic research in the medical and health sciences comes from the universities and hospitals, but Australia's medical research institutes, the members of AAMRI [Australian Association of Medical Research Institutes], have the highest international profiles". On this point, our data agree: while these institutes produce publications which appear in the most highly "visible" journals, the universities and hospitals produce the vast bulk of the research output, and contribute most to local medical issues.
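For clarity, the cpp marker reduces to a simple ratio; the figures used below are hypothetical.

```python
# cpp (citations per publication), as used by Bourke and Butler: total
# citations received by a site's publications, divided by the number of
# publications counted over the same period.

def cpp(total_citations: int, n_publications: int) -> float:
    return total_citations / n_publications

print(cpp(1500, 500))  # 3.0 (hypothetical figures)
```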

However, if, as an exercise, Australian journal publications are excluded from the calculations, the remaining overseas-published medical research from the universities and hospitals also appears in these highly "visible" journals.

Finally, it is likely that some significant under-representation of specific teaching hospital sites would have occurred in Bourke and Butler's6 study, as "where a research group based in a hospital with a university connection lists the university in the address, we consistently assign the publication to the university". In this report, publications are assigned to both the university and the hospital, and comparisons then made differentially. Based on the Medline database, an average of 14% (but, on a case-by-case basis, up to 43%) of hospital publications include a university address.


Conclusions

This survey confirms the important contribution to medical research and teaching made by NSW teaching hospitals. Both their publication output and their perceived scientific quality and visibility can be considered high compared with those of peer NSW research institutes. In addition, it is the non-research-institute-based hospital sites that most significantly contribute to research published in Australia. It is hoped that the process of identifying quality research from Australian research institutions, as outlined in this report, will help promote research activity and its future development at these sites, and help to set benchmark standards for this research and teaching activity.


Acknowledgements

The concept derives from four previous internal research review reports to the Westmead Scientific Advisory Committee (SAC). Most of the data analysis was performed "after-hours", and I am grateful to Ms Beryl Dawson for her patience and understanding. Professor Tony Cunningham and Professor Cres Eastman are thanked for their support and encouragement. Sincere appreciation also to my colleague Dr Brian Nankivell, who set me upon this path of discovery, and to Ms Claire Wolczak, who sought out and found copies of the required SCI reports.

Conflict of Interest: I am employed within one of the institutions in this report, but have attempted at all times to be objective.


References

  1. Science Citation Index. Journal citation reports. A bibliometric analysis of science journals in the ISI database. Philadelphia: Institute for Scientific Information, 1993, 1994, 1995, 1996.
  2. Garfield E. How can impact factors be improved? BMJ 1996; 313: 411-413.
  3. Seglen PO. Why the impact factor of journals should not be used for evaluating research. BMJ 1997; 314: 498-502.
  4. Smith R. Unscientific practice flourishes in science. Impact factors of journals should not be used in research assessment. BMJ 1998; 316: 1036.
  5. Williams G. Misleading, unscientific, and unjust: the United Kingdom's research assessment exercise. BMJ 1998; 316: 1079-1082.
  6. Bourke PF, Butler L. The research enterprise: mapping Australia's basic research in the medical and health sciences. Med J Aust 1997; 167: 610-613.
  7. Research School of Social Sciences, Australian National University, Internet webpage (URL site: <http://rsss.anu.edu.au/>). Accessed August 1998.
  8. Sittig DF. Identifying a core set of medical informatics serials: an analysis using the MEDLINE database. Bull Med Libr Assoc 1996; 84: 200-204.
  9. Takahashi K, Hoshuyama T, Ikegami K, et al. A bibliometric study of the trend in articles related to epidemiology published in occupational health journals. Occup Environ Med 1996; 53: 433-438.
  10. Dunn K, Chisnell C, Sittig DF. A quantitative method for measuring clinical user journal needs: a pilot study using CD Plus MEDLINE usage statistics. Medinfo 1995; 8: 1428-1432.
  11. Schloman BF. Mapping the literature of allied health: project overview. Bull Med Libr Assoc 1997; 85: 271-277.
  12. Schloman BF. Mapping the literature of health education. Bull Med Libr Assoc 1997; 85: 278-283.
  13. Wakiji EM. Mapping the literature of physical therapy. Bull Med Libr Assoc 1997; 85: 284-288.
  14. Burnham JF. Mapping the literature of radiologic technology. Bull Med Libr Assoc 1997; 85: 289-292.
  15. Burnham JF. Mapping the literature of respiratory therapy. Bull Med Libr Assoc 1997; 85: 293-296.
  16. Hecht F, Hecht BK, Sandberg AA. The journal "impact factor": a misnamed, misleading, misused measure. Cancer Genet Cytogenet 1998; 104: 77-81.
  17. Gallagher EJ, Barnaby DP. Evidence of methodological bias in the derivation of the Science Citation Index impact factor. Ann Emerg Med 1998; 31: 107-109.
  18. Opthof T. Sense and nonsense about the impact factor. Cardiovasc Res 1997; 33: 1-7.

(Received 3 Feb, accepted 20 Oct 1998)


Author's details

Institute of Clinical Pathology and Medical Research (ICPMR), Westmead Hospital, Western Sydney Area Health Service, Westmead, NSW.
Emmanuel J Favaloro, BSc(Hons), PhD, Senior Hospital Scientist, Haematology.
Reprints: Dr E J Favaloro, Senior Hospital Scientist, Haematology, ICPMR, Westmead Hospital, Westmead, NSW 2145.
Email: emmanuel@icpmr.wsahs.nsw.gov.au



