
Sea change: public reporting and the safety and quality of the Australian health care system

Clifford F Hughes and Patricia Mackay
Med J Aust 2006; 184 (10): S44. || doi: 10.5694/j.1326-5377.2006.tb00361.x
Published online: 15 May 2006
Evolution of quality and safety monitoring in Australia

The most startling system report was the Quality in Australian Health Care Study (QAHCS) published in 1995.2 Wilson and colleagues clearly identified that adverse events occurred in Australian health care, that their frequency was significantly underestimated, and that much needed to be done to restore faith in the system. These findings were reinforced by the US Institute of Medicine report, To err is human, which demonstrated the magnitude of adverse events in the United States.3

Anaesthetic reports: QAHCS was not the first attempt to measure adverse events. In 1960, a Special Committee Investigating Deaths Under Anaesthesia (SCIDUA) was sponsored by the New South Wales Government. Over the next two decades, all other Australian states followed suit. Two publications on anaesthesia mortality in New South Wales4,5 attracted worldwide interest and prompted regular publications and reports by each of the state anaesthesia mortality committees. In 1976, the Victorian Consultative Council on Anaesthetic Mortality and Morbidity was established, with its terms of reference expanded to include anaesthesia morbidity. Over the ensuing 30 years, the focus has changed from analysis of anaesthetic and perioperative mortality to detailed consideration of morbidity, with provision of feedback to practitioners both directly and via a web-based information service.

In 1990, jurisdictional anaesthesia mortality data were pooled in a national report compiled by the National Health and Medical Research Council and, from 1995, by the Australian and New Zealand College of Anaesthetists (ANZCA).

Despite a high participation rate by anaesthetists, these reports were limited by varying state legislation, their voluntary nature, and limited information on the numerators (incidence of death) and denominators (population at risk). Even so, the reports clearly demonstrate a substantial reduction in the incidence of anaesthesia-related mortality. This success prompted other parts of the health care system to follow suit.

Perinatal reports: In addition, since 1964, reports on perinatal mortality have been published from around Australia. Victoria, Western Australia, the Northern Territory and NSW each publish annual reports on maternal and perinatal deaths, and the Australian Institute of Health and Welfare provides national statistics.

Surgical reports: In 1994, the NSW Minister for Health initiated a Special Committee Investigating Deaths Associated With Surgery (SCIDAWS). Its purpose was to collate surgical mortality data submitted voluntarily, particularly for the purpose of educating surgeons and health care providers. This committee shared resources with SCIDUA; both relied on the commitment of a small group of practitioners to review reported deaths, to provide feedback to the reporting practitioners, and to aggregate and analyse the data.

The surgical committee was much less successful than its anaesthetic counterpart. This was due in part to the wide variety of surgical specialties, compared with the relative homogeneity of anaesthesia practice, and in part to a very inconsistent commitment to reporting systems among surgical units.

Analysis of critical incidents and crisis management

In 1988, a national anaesthesia incident monitoring system (Advanced Incident Management System [AIMS]) was developed, with its base in South Australia. The system was designed to implement effective procedures for analysis of critical incidents and development of crisis management.

In 1993, the first 2000 incidents were reported,6 and numerous subsequent publications accelerated the introduction of mandatory monitoring standards. These have significantly reduced perioperative hypoxic deaths. In addition, management algorithms were developed and regularly improved, and are now widely used. AIMS has made a valuable contribution to reducing morbidity and mortality, along with education efforts by ANZCA, and the introduction of new agents and techniques.

Nevertheless, there was significant disquiet. Occasional catastrophic events, both in Australia (King Edward Memorial Hospital Inquiry7) and internationally (Bristol Royal Infirmary Inquiry8 and the Manitoba Pediatric Cardiac Surgery Inquest9) indicated that the health care system was reacting to “disasters”, rather than developing systematic approaches to quality and safety.

Other developments

Systemic reporting remained elusive, but isolated examples of reporting began to appear. In cardiac surgery, the Victorian committee of the Australasian Society of Cardiac and Thoracic Surgeons developed a state-based report on mortality and morbidity after cardiac surgery. De-identified data on individual hospital cardiac surgery units have been published annually since 2001–2002.13 This report was one of the first partially risk-adjusted mortality reviews in a specialty in Australasia. It built on the work of the National Heart Foundation and the Australian Institute of Health and Welfare, which had published annual reports on mortality in cardiac surgery for many years. Other state committees within the specialty of cardiac surgery are following.

Incidents related to specific devices are also monitored. The first device-specific register was established by the Australia and New Zealand Heart Valve Registry in 1998 and has followed up almost 1000 patients who received Bjork–Shiley CC valves between 1979 and 1984. The 20-year survival of the Australian cohort is about to be reported (A Callaway, Australia and New Zealand Heart Valve Registry; C F Hughes, unpublished observation).

In addition, the Australian Orthopaedic Association has developed a register of joint replacements.14 The centralisation of data in this register and liaison with similar registers in other countries has enabled surgeons to identify devices with unacceptably high failure rates and to remove them from clinical use much more rapidly.15

The sentinel event program

In 2003, the ACSQHC commissioned a major study that identified eight sentinel events, agreed by all jurisdictions, for compulsory annual reporting. This prompted state jurisdictions to begin developing their own processes for identifying adverse events.

The Victorian State Government produced its first annual report on adverse events in 2003, based on the nationally defined sentinel events. More recently, the Victorian Auditor General released the report Managing patient safety in public hospitals.16

In January 2005, NSW Health issued its first adverse event report, as part of its Patient Safety and Clinical Quality Program.17 This went far beyond the nationally defined sentinel events, including all serious adverse events identified. The reporting process builds on software provided by AIMS to create an incident information management system available electronically across the health care system. Any of the 108 000 clinician and non-clinician employees can report (anonymously if they wish) any incident — clinical or corporate — to a database jointly managed by NSW Health and the NSW Clinical Excellence Commission. All incidents are coded by severity and likelihood of recurrence (Severity Assessment Code [SAC], 1–4). SAC1 events (the most severe) must be notified to NSW Health, and a root cause analysis completed within a statutory time frame. This process is protected by privilege, but a publicly available causal statement is issued on its completion. A second report on SAC1 events was released in January 2006.

The reporting rates with this electronic system have been staggering, increasing from an initial 2000 to about 10 000 reports per month. Trend analysis of system-wide data is provided to each area health service and, for the first time, information based on these data is being returned to clinicians and managers alike.

Role of clinical colleges

In addition to ANZCA and the Perinatal Society of Australia and New Zealand, other clinical colleges have had an active interest in audits as part of their commitment to professional development. The Royal Australasian College of Surgeons (RACS) monitors compliance with its recommendations for surgical audits for all fellows.

More recently, the RACS has embraced the concept of a binational audit of surgical mortality based on the Western Australian Audit of Surgical Mortality,18 now in its third year of reporting surgical outcomes. South Australia and Tasmania already participate in this audit, and others are expected to follow, including New Zealand.

In Western Australia, 73% of surgeons indicated that they had changed their practice, and 11% knew of surgeons who had done so, as a result of participation in the audit.18 Practice modification depended not on the public reports produced annually, but on the regular individual feedback provided to surgeons, and the illustrative information provided to the entire profession.

NSW will develop a virtually identical model, coordinated by the Clinical Excellence Commission, to replace the SCIDAWS program. The new program will begin by selectively targeting specialty groups, to avoid duplication of reporting from specialties such as cardiac surgery and neurosurgery, which are developing their own specialty-based audits.

The sea change has now reached maturity. We live in a health environment in which data can become information, information applied can become effective knowledge, and clinical pathways and guidelines can be developed on the available evidence and tested against predetermined performance measures. Many accreditation systems have evolved. For instance, the Australian Council on Healthcare Standards is now contracted to accredit 94% of hospitals Australia-wide. Much of this surveillance is based on outcome data regularly supplied by clinicians and institutions.

Future developments

The debate has now turned to public reporting of individual clinician performance and is by no means limited to Australia.19-23 The fact remains that performance data are already available and will be made public from time to time in circumstances which may be both helpful and harmful to practitioners and patients alike. This debate should also include the issue of a unique patient identifier, and whether it will form part of a secure clinical information program or become a de facto national identity card.

It is also important that we continue to develop evidence about the use of public reports. In an article comparing public reporting in the United States and the United Kingdom, Marshall and colleagues stated that “the trend towards increased disclosure of quality information is, in our view, irresistible and, therefore, determining what style of reporting works best and in what circumstances is a major policy task”.19 The impact that public reports have on patient confidence, on the behaviour of policy makers, and on clinicians needs to be evaluated in a scientific manner.

Marshall and colleagues also reported that a consistent finding in both the US and the UK is the lack of public interest in quality reports.19 While public reporting has the potential to bring about change, its purpose needs to be clarified and debated by the entire community.

An interesting area of research will be whether the changes that follow public reporting relate to better delivery of health care or better risk management (avoidance), as high-risk cases are shifted to other practitioners, institutions or even modes of care. Marshall et al demonstrated clearly that the most significant benefits occur at the local level, when individual hospitals or surgeons review their own performance in their own environment and make decisions to improve outcomes in comparison with reliable benchmarks.19

From the comfort of our beach chairs, we observe the relentless waves. The demands of clinical activity, strained resources and a limited workforce will continue to roll towards us. While redesign of the system may focus our attention on better ways of coping with these demands, we can relax somewhat in knowing that — together with all the players we have described — the seascape has changed inevitably and forever. Safety and quality, indeed effectiveness, appropriateness, equity and ease of access are now firmly in focus.

  • Clifford F Hughes1
  • Patricia Mackay2,3

  • 1 Clinical Excellence Commission, Sydney, NSW.
  • 2 Royal Melbourne Hospital, Melbourne, VIC.
  • 3 Victorian Consultative Council on Anaesthetic Mortality and Morbidity, Department of Human Services, Melbourne, VIC.



Competing interests:

Clifford Hughes is the Medical Director of Devtrack Pty Ltd, which funds the Australia and New Zealand Heart Valve Registry program. This program is partly funded by The Bowling Settlement in the USA, a court-appointed settlement following litigation around the Bjork–Shiley valve. The Registry also receives funding from the Pfizer Corporation, the original manufacturer and distributor of the valve. The Registry has independent operating procedures, and the supporting sources had no involvement in the production of this article.

  • 1. Shakespeare W. The Tempest. Oxford: Oxford University Press, 1987.
  • 2. Wilson RM, Runciman WB, Gibberd RW, et al. The Quality in Australian Health Care Study. Med J Aust 1995; 163: 458-471.
  • 3. Committee on Quality of Health Care in America, Institute of Medicine. To err is human: building a safer health system. Washington, DC: National Academy Press, 2000.
  • 4. Holland R. Special Committee Investigating Deaths Under Anaesthesia: report on 745 classified cases, 1960-1968. Med J Aust 1970; 1: 573-594.
  • 5. Holland R. Anaesthetic mortality in New South Wales. Br J Anaesth 1987; 59: 834-841.
  • 6. Holland R, Hains J, Roberts JG, Runciman WB. Symposium: the Australian Incident Monitoring Study. Anaesth Intensive Care 1993; 21: 501-505.
  • 7. Douglas N, Robinson J, Fahy K. King Edward Memorial Hospital (KEMH) Inquiry. Perth: Department of Health Government of Western Australia, 2001. Available at: http://www.health.wa.gov.au/kemhinquiry (accessed Apr 2004).
  • 8. Bristol Royal Infirmary Inquiry. Learning from Bristol: the report of the public inquiry into children’s heart surgery at the Bristol Royal Infirmary 1984-1995. Command Paper: CM 5207. The Inquiry, 2001. Available at: http://www.bristol-inquiry.org.uk (accessed Jun 2004).
  • 9. Sinclair M. The report of the Manitoba Pediatric Cardiac Surgery Inquest: an inquiry into twelve deaths at the Winnipeg Health Sciences Centre in 1994. Available at: http://www.pediatriccardiacinquest.mb.ca (accessed Feb 2006).
  • 10. Rigg JR, Jamrozik K, Myles PS, et al; MASTER Anaesthesia Trial Study Group. Epidural anaesthesia and analgesia and outcome of major surgery: a randomised trial. Lancet 2002; 359: 1276-1282.
  • 11. Myles PS, Leslie K, McNeil J, et al. Bispectral index monitoring to prevent awareness during anaesthesia: the B-Aware randomised controlled trial. Lancet 2004; 363: 1757-1763.
  • 12. Australian and New Zealand College of Anaesthetists. Acute pain management — scientific evidence. 2nd ed. Melbourne: ANZCA, 2005.
  • 13. Reid CM, Rockell M, Skillington P, Shardey G. Victorian Cardiac Surgery Database Project. Annual Report 2001–2002. Melbourne: Australasian Society of Cardiothoracic Surgeons, 2003.
  • 14. Graves S, Davidson D, Ingerson L, et al. Australian Orthopaedic Association National Joint Replacement Registry annual report. Adelaide: Australian Orthopaedic Association, 2005.
  • 15. Graves SE, Davidson D, Ingerson L, et al. The Australian Orthopaedic Association National Joint Replacement Registry. Med J Aust 2004; 180 (5 Suppl): S31-S34.
  • 16. Auditor General Victoria. Managing patient safety in public hospitals. PP No. 121, Session 2003-05. Melbourne: Victorian Auditor General, 2005.
  • 17. NSW Health. Patient Safety and Clinical Quality Program. First report on incident management in the NSW public health system 2003–2004. Available at: www.health.nsw.gov.au/pubs/2005/pdf/incident_mgmt.pdf (accessed Feb 2005).
  • 18. Semmens JB, Aitken RJ, Sanfilippo FM, et al. The Western Australian Audit of Surgical Mortality: advancing surgical accountability. Med J Aust 2005; 183: 504-508.
  • 19. Marshall MN, Shekelle PG, Davies H, Smith P. Public reporting on quality in the United States and the United Kingdom. Health Affairs 2003; 22: 134-148.
  • 20. Neil DA, Clarke S, Oakley JG. Public reporting of individual surgeon performance information: United Kingdom developments and Australian issues. Med J Aust 2004; 181: 266-268.
  • 21. Marasco SF, Ibrahim JE, Oakley J. Public disclosure of surgeon-specific report cards: current status of the debate. ANZ J Surg 2005; 75: 1000-1004.
  • 22. Hughes CF, Bearham G. Surgeon specific report cards. ANZ J Surg 2005; 75: 927-928.
  • 23. Chassin MR. Achieving and sustaining improved quality: lessons from New York State and cardiac surgery. Health Affairs 2002; 21: 40-51.
