
Public reporting of individual surgeon performance information: United Kingdom developments and Australian issues

David A Neil, Justin G Oakley and Steve Clarke
Med J Aust 2004; 181 (5): 266-268.
Published online: 6 September 2004

Abstract

  • The United Kingdom is currently introducing public reporting of performance information for individual cardiac surgeons. The reports will indicate whether a surgeon has an acceptable level of performance, measured by in-hospital mortality.

  • In the United States, surgeon-specific performance data have been available for over a decade.

  • Arguments from both safety and accountability perspectives provide strong justifications for public reporting of such data.

  • Were Australia to adopt similar public reporting processes, we should learn from overseas experiences.

  • Surgical associations should be actively involved in developing data standards and processes for data collection, validation, analysis and publication.

  • Any Australian policy initiative for public reporting of individual surgeon data should be backed by a political commitment to adequate funding.

There is increasing evidence that public reporting of comparative performance data improves quality of healthcare.1 However, the most controversial question is whether patients should have access to performance data for individual clinicians. In the United States, outcomes data for individual cardiac surgeons have been publicly available in New York State since 1991, in Pennsylvania since 1992, and in New Jersey since 1994.2-4 In the United Kingdom, performance information for individual cardiac surgeons will be made public this year for the first time: the Society of Cardiothoracic Surgeons will be releasing mortality information for the approximately 180 surgeons who perform coronary artery bypass surgery.5,6

Performance reporting overseas: In the US, the key comparator reported is the surgeon’s risk-adjusted mortality rate. In the UK, however, the report will not take the form of a “league table”, at least initially, primarily because thoroughly validated risk adjustment of each surgeon’s patients is not yet available. Instead, a three-star scale will be reported, indicating whether a surgeon “fails”, “meets” or “exceeds” the standards of the Society of Cardiothoracic Surgeons. The index procedure is first-time, isolated coronary artery bypass graft (CABG) surgery, with inpatient mortality as the outcome indicator. Initially, the required standard is that an individual surgeon’s mortality rate, taken as a 3-year rolling average, must lie within the 99.9% confidence limits around the national mean mortality rate. The intention is that risk stratification will be phased in over the next few years.
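The “within 99.9% confidence limits of the national mean” standard can be illustrated with a short calculation. The Society’s exact statistical method is not described in this article, so the sketch below assumes a simple binomial model around the national mean, with z ≈ 3.29 for two-sided 99.9% limits; the surgeon and national figures are hypothetical.

```python
from math import sqrt

def control_limits(p_national: float, n_cases: int, z: float = 3.29) -> tuple[float, float]:
    """Approximate 99.9% (two-sided, z ~ 3.29) control limits for a surgeon's
    observed mortality rate, assuming a plain binomial model around the
    national mean (an assumption, not the Society's stated method)."""
    se = sqrt(p_national * (1 - p_national) / n_cases)
    return max(0.0, p_national - z * se), min(1.0, p_national + z * se)

def assess(deaths: int, n_cases: int, p_national: float) -> str:
    """Flag a surgeon only if mortality lies above the upper control limit."""
    _, hi = control_limits(p_national, n_cases)
    rate = deaths / n_cases
    return "fails standard" if rate > hi else "meets standard"

# Hypothetical figures: national first-time CABG mortality 2%, and a surgeon
# with 9 deaths in 300 cases over three years (a 3.0% crude rate).
print(assess(9, 300, 0.02))  # prints "meets standard"
```

With only a few hundred cases over three years the limits are wide: a surgeon at 3.0% against a 2.0% national mean still falls inside them, which is one reason the UK scheme reports a pass/fail standard rather than a ranking.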

To understand this initiative it helps to set it against the background of the huge structural reforms under way in the UK National Health Service, which aim to transform it from a centralised monolith into a more flexible, “patient-centred” system, with more autonomy for individual trusts.7 While these reforms devolve more managerial authority to hospitals, clinical audit has been standardised and centralised. The Commission for Healthcare Audit and Inspection (which replaced the Commission for Healthcare Improvement in April 2004) is the independent authority that undertakes clinical-governance audits for British hospitals and other healthcare trusts. All acute, specialist, mental health, ambulance and primary care trusts are audited against a range of indicators and given a rating on a three-star scale,8 which is published on the Internet (Healthcare Commission, www.chai.org.uk/Homepage/fs/en).

The principal driver for these audit and governance reforms was the Bristol Royal Infirmary Inquiry,9 which announced its findings in July 2001. That inquiry, chaired by Sir Ian Kennedy, was concerned with understanding how unacceptable mortality rates for certain surgical procedures in the Bristol paediatric cardiac unit had gone unchecked from 1984 to 1995. Here was a case where self-monitoring had manifestly failed. The report found the roots of the Bristol tragedy in the insular and conformist “club culture” of the NHS. It argued that the goals of the reform process demanded a sea change in the ethos of the hospital system.

In January 2002, the UK government’s response to the Inquiry was tabled in the House of Commons.10 Agreeing in full or in part to 187 of the Inquiry’s 198 recommendations, the government committed to far-reaching reforms aimed at developing a culture of public disclosure and accountability in healthcare. At that time, the then Secretary of State for Health, Alan Milburn, announced an agreement with the Society of Cardiothoracic Surgeons to publish mortality rates for every cardiac surgeon in Britain from April 2004. He stated that “this is just the first step to publishing more information on individual consultant outcomes over time”.11

In 1998, in the aftermath of the events at Bristol, the Society had begun collecting surgeon-specific data for certain marker operations (primarily, first-time CABG). These data are independently analysed, and the results scrutinised within the Society to provide early warning of potential problems. Surgeons whose performance lies outside predetermined limits are notified and required to respond.

Cardiothoracic surgeons were “not comfortable” with public release of individual performance information but accepted that it was inevitable. The surgeons’ main concern is that it will lead to defensive medicine and the avoidance of high-risk patients.12 If individual mortality rates are published, risk adjustment for each surgeon’s casemix is essential, not least because the sickest patients are usually assigned to more experienced consultants. Whether or not high-risk patients are disadvantaged then depends crucially on whether surgeons have confidence in the risk-adjustment process.
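Risk adjustment for casemix is typically done by indirect standardisation: observed deaths are compared with the deaths expected from each patient’s predicted risk under a risk model (such as a EuroSCORE-style logistic model). The article does not specify a method, so the sketch below is a generic illustration with hypothetical figures.

```python
def risk_adjusted_rate(observed_deaths: int,
                       predicted_risks: list[float],
                       population_rate: float) -> float:
    """Indirect standardisation: (observed / expected) * population rate.
    predicted_risks holds each patient's predicted probability of death from
    a risk model -- the model itself is assumed, not specified in the article."""
    expected = sum(predicted_risks)  # expected deaths given this casemix
    return observed_deaths / expected * population_rate

# Hypothetical surgeon with a sick casemix: the model predicts 6 deaths in
# 100 cases, 5 were observed, and population CABG mortality is 2%.
risks = [0.06] * 100  # per-patient predicted risks summing to 6.0
print(round(risk_adjusted_rate(5, risks, 0.02), 4))  # prints 0.0167
```

A surgeon with a 5% crude mortality rate but a very sick casemix (6% expected) ends up with a risk-adjusted rate below the 2% population average — precisely the reassurance surgeons need before accepting publication of individual data.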

Reporting in Australia: There have been moves to publish comparative surgical performance data in Australia, but only at the level of cardiac units. Outcome data for individual surgeons are collected by many hospitals and by surgeons themselves, but they are not centrally coordinated into a comprehensive database, and no surgeon-specific data are available to the public.

The 1999 Victorian Health Services Policy Review recommended that the Victorian Department of Human Services (DHS) publish annual data on the comparative performance of hospitals for specific procedures, and that the Commonwealth and the States collaboratively develop a set of comprehensive, consumer-oriented performance indicators.13 In 2003, the DHS began publishing annual outcome reports for the six Victorian public hospital cardiac units, drawing on a database established by the Australasian Society of Cardiac and Thoracic Surgeons with the help of a $200 000 grant from the DHS.14-17 However, in these reports the unit-level data are de-identified. These reports show that overall cardiac surgery outcomes compare favourably with those in the US and UK. Also, there has been an overall reduction in mortality rates for CABG surgery in Victorian public hospitals from 2002 to 2003, which may have been partly due to the monitoring of unit performance in these public reports.

The Australasian Society of Cardiac and Thoracic Surgeons supports a national scheme of cardiac-surgery reporting based on the Victorian model. The Australian Council for Safety and Quality in Health Care is also developing the National Cardiac Procedures Register, which will record de-identified, risk-adjusted information on the outcomes of various procedures, such as CABG surgery and interventional cardiology procedures.18

Improving quality of healthcare: Good clinical governance requires data collection at the individual level. One of the clearest lessons from Bristol is the need for a continuous audit process to identify outliers in mortality and complication rates. When public reporting systems were developed in the US, an explicit aim was to try to improve clinical quality by changing consumer behaviour. It was hoped that better-informed purchasers would demand quality, and that poor performers would be disciplined by the market.19 CABG outcomes in New York have improved markedly since “report cards” were introduced, but not in fact because of an effect on the market share of performance outliers. Instead, healthcare quality improved as poorly performing hospitals used the data to identify problems with their processes.20

The question, therefore, is not whether these data should be collected, but whether they should be made public. The answer depends on what we think public reporting of individual performance is for — patient choice, or patient safety and professional accountability? We argue that, independent of considerations of patient choice, the arguments for both safety and accountability strongly justify public reporting of outcomes. The legitimate public interest in healthcare quality makes it very difficult to argue against the public release of performance data.21

Reporting of surgical outcomes at both unit and individual levels can have a powerful positive effect on public confidence and trust. However, overseas experience shows just how challenging an undertaking this is. Meaningful performance assessment requires the development of guidelines and standards against which performance is to be measured. Half of the surgeons on any league table will, by definition, fall below the median. For public reporting, what matters most is not a ranking, but that surgeons are shown to meet acceptable performance standards. The development of national standards depends on centralised data collection, to build an evidence base and to identify best practice.

Before individual performance data can be publicly released, a standardised minimum dataset must be established, with uniform reporting requirements in all centres and an audit system to validate data against case notes. The data must also be robust, averaged over several years to counter statistical volatility in individual casemix, and adequately risk-adjusted.

If professional reticence about reporting surgeon-specific data is to be overcome, surgeons must have confidence in the data. This means that the surgical associations must take the lead in developing data standards, and processes of collection, validation, analysis and publication. In addition, the possible impact on access to surgical care should be considered, as well as the form and context in which the data should be made available to patients. All of this will take considerable resources, and any policy initiatives for comprehensive public reporting of outcomes in Australia must be backed by a political commitment to proper funding.

We believe that, for surgeons to meet their obligations for public accountability, public reporting of surgeon-specific performance information is ultimately needed. In the next few years, it will be important to watch how this reporting develops in the UK, whether fears of defensive medicine prove justified, and how it affects both surgeons’ and the public’s perception of the surgical profession. We hope that Australia will not need a scandal like Bristol for state and federal governments to fund the development of the databases and audit systems needed for continuous, standards-based performance management. As the Bristol example illustrates, public trust in the healthcare system is too important to be jeopardised. Once lost, it is both difficult and very expensive to regain.

  • David A Neil1
  • Justin G Oakley2
  • Steve Clarke3

  • 1 Centre for Human Bioethics, Monash University, Melbourne, VIC.
  • 2 Centre for Applied Philosophy and Public Ethics, Charles Sturt University and Australian National University, Canberra, ACT.

Acknowledgements: 

This research was supported by National Health and Medical Research Council Project Grant 236877, “An ethical analysis of the disclosure of surgeons’ performance data to patients within the informed consent process”. We thank Joe Ibrahim, Helga Kuhse, David Macintosh, Silvana Marasco, Christopher Reid, Rosemary Robins and Julian Smith for helpful discussion. However, the views expressed in this article are entirely our own.

Competing interests:

None identified.

  • 1. Marshall MN, Brook RH. Public reporting of comparative information about quality of healthcare [editorial]. Med J Aust 2002; 176: 205-206. <MJA full text>
  • 2. Center for Consumer Health Care Information. Coronary artery bypass surgery in New York State 1997-1999. New York: New York State Department of Health, 2002. Available at: http://www.health.state.ny.us/nysdoh/heart/1997-99cabg.pdf (accessed Jun 2004).
  • 3. Pennsylvania Health Care Cost Containment Council. Pennsylvania’s guide to coronary artery bypass graft (CABG) surgery 2000. Available at: www.phc4.org/reports/cabg/00/default.htm (accessed Jun 2004).
  • 4. New Jersey Department of Health and Senior Services. Cardiac surgery in 2000 in New Jersey. Available at: www.state.nj.us/health/hcsa/cabmenu.htm (accessed Jun 2004).
  • 5. Keogh B. Letter from the Secretary (UK Society of Cardiothoracic Surgeons) to the membership on 3 March 2003. Available at: www.aats.org/aatsdoc/?did=6107 (accessed Jun 2004).
  • 6. Dyer O. Heart surgeons to be rated according to bypass surgery success. BMJ 2003; 326: 1053.
  • 7. Department of Health. The NHS plan. A plan for investment. A plan for reform. Presented to Parliament by the Secretary of State for Health, 2000. Command Paper: Cm 4818-I. Available at: www.nhs.uk/nationalplan/nhsplan.htm (accessed Jun 2004).
  • 8. National Health Service. Star ratings explained [updated 15 June 2004]. Available at: www.nhs.uk/Root/StarRatings/Explained.asp (accessed Jun 2004).
  • 9. Bristol Royal Infirmary Inquiry. Learning from Bristol: the report of the public inquiry into children's heart surgery at the Bristol Royal Infirmary 1984-1995. Command Paper: CM 5207. The Inquiry, 2001. Available at: www.bristol-inquiry.org.uk/ (accessed Jun 2004).
  • 10. Learning from Bristol: the Department of Health’s response to the Report of the Public Inquiry into Children’s Heart Surgery at the Bristol Royal Infirmary 1984-1995. Presented to Parliament by the Secretary of State for Health, 2002. Command Paper: Cm 5363. Available at: www.dh.gov.uk/assetRoot/04/05/94/79/04059479.pdf (accessed Jun 2004).
  • 11. Milburn A. Secretary of State’s speech to Parliament announcing the Government’s response to the Bristol Royal Infirmary Report. UK House of Commons, Hansard. 17 January 2002.
  • 12. Vass A. Performance of individual surgeons to be published. BMJ 2002; 324: 189.
  • 13. Duckett S, Casemix Consulting, Hunter L. Health Services Policy Review. Final report - November 1999. Melbourne: Victorian Department of Human Services, Acute Health Division. Available at: www.dhs.vic.gov.au/ahs/archive/servrev/index.htm (accessed Jul 2004).
  • 14. Reid CM, Rockell M, Skillington P, Shardey G, on behalf of the Australasian Society of Cardiac and Thoracic Surgeons Victorian Database Project Steering Committee. Cardiac surgery in Victorian public hospitals: report to the public 2002. Victorian Department of Human Services. Available at: www.health.vic.gov.au/cardiacsurgery (accessed Jun 2004).
  • 15. Reid CM, Rockell M, Skillington P, Shardey G, on behalf of the Australasian Society of Cardiac and Thoracic Surgeons Victorian Database Project Steering Committee. Cardiac surgery in Victorian public hospitals 2003: report to the public. Melbourne: Victorian Department of Human Services, 2003.
  • 16. Smith JA. Towards a state-wide cardiac surgery database: the Victorian initiative. Heart Lung Circulation 2001; 10 Suppl: S26-S28.
  • 17. Reid CM, Solterbeck A, Buxton BF, et al. Developing performance indicators for cardiac surgery: A demonstration project in Victoria. Heart Lung Circulation 2001; 10 Suppl: S29-S33.
  • 18. Australian Council for Safety and Quality in Health Care. Safety & Quality Council funds cardiac register [media release]. Canberra: the Council. Available at: www.safetyandquality.org/index.cfm?page=Media# (accessed Jun 2004).
  • 19. Marshall MN, Shekelle PG, Brook RH, Leatherman S. Dying to know. Public release of information about quality of health care. Santa Monica, Cal: RAND Corporation and Nuffield Trust, 2000: 29-30, 43-44. Available at: www.rand.org/publications/MR/MR1255/ (accessed Jun 2004).
  • 20. Chassin M. Achieving and sustaining improved quality: lessons from New York State and cardiac surgery. Health Affairs 2002; 21: 40-51. Available at: www.racp.edu.au/hpu/cssp/chassin_paper.pdf (accessed Jul 2004).
  • 21. Clarke S, Oakley J. Informed consent and surgeons’ performance. J Med Philos 2004; 29: 11-35.
