An improvement focus in public reporting: the Queensland approach

Stephen J Duckett, Justin Collins, Maarten Kamp and Kew Walker
Med J Aust 2008; 189 (11): 616-617. doi: 10.5694/j.1326-5377.2008.tb02213.x
Published online: 1 December 2008

Public reporting of health care outcomes such as in-hospital mortality or complication rates is now well established internationally. It is less advanced in Australia, but has been advocated academically1 and as part of contemporary federal–state negotiations. The current political direction favours publication of risk-adjusted in-hospital mortality, infection and complication rates for each facility as the preferred indicators.2-4

Public reporting in Queensland

One of the outcomes of Queensland’s Bundaberg Hospital scandal5,6 and the associated public inquiries was a revitalisation of quality management processes and a new emphasis on transparency in the health system. The Health Services Act 1991 (Qld) was changed in 2005 to require publication of an annual public hospital performance report. A shake-up in clinical governance also occurred, with the introduction of new quality management processes that included more robust and consistent reporting of clinical incidents and sentinel events7 as well as a monitoring system using statistical process control charts for 30 clinical indicators.8-10 The statistical process control approach emphasises the dynamic nature of performance against particular outcome measures and flags significant variations from the state mean. Public and private hospitals are given feedback on their performance against the indicators on a monthly basis. Depending on the extent to which a hospital’s indicators deviate from the state average, there are requirements for reporting at various levels of the bureaucratic hierarchy, using a standardised approach to reporting findings that emphasises systematic reasons for variation.5
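To illustrate the style of monitoring described above, the sketch below computes a variable life-adjusted display (VLAD), the statistical process control tool Queensland Health applies to its clinical indicators.10 It is a minimal illustration only: the risk model, control limits and data are hypothetical assumptions, not Queensland Health's actual specification.

```python
def vlad_series(predicted_risks, outcomes):
    """Cumulative (expected - observed) adverse outcomes, patient by patient.

    predicted_risks: per-patient probability of the adverse outcome, from a
        risk-adjustment model (hypothetical here).
    outcomes: 1 if the adverse outcome occurred for that patient, else 0.
    An upward-trending series means fewer adverse outcomes than expected.
    """
    statistic, series = 0.0, []
    for p, y in zip(predicted_risks, outcomes):
        statistic += p - y  # "lives gained" relative to expectation
        series.append(statistic)
    return series


def flagged_points(series, lower=-2.0, upper=2.0):
    """Return indices where the VLAD crosses fixed control limits.

    Real implementations derive limits from risk-adjusted CUSUM theory
    rather than using fixed constants; these are placeholders.
    """
    return [i for i, v in enumerate(series) if v <= lower or v >= upper]


if __name__ == "__main__":
    risks = [0.05, 0.10, 0.05, 0.20, 0.05, 0.05, 0.10, 0.05]  # hypothetical
    deaths = [0, 1, 0, 1, 1, 0, 1, 0]                         # hypothetical
    series = vlad_series(risks, deaths)
    print([round(v, 2) for v in series])
    print("flagged at patient indices:", flagged_points(series))
```

In a real chart the series would be plotted over time for each hospital, with a flag raised when the line crosses the control limits rather than when a single month's rate happens to sit above the state mean.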

What is critical in the new approach is not that an indicator is flagged for further investigation, but that robust investigation takes place. Investigation reports for indicators flagged at twice the state average rate (for non-mortality indicators, such as complications of care) or 75% above the state average rate (for mortality indicators) are reviewed externally to the hospital to assess the adequacy of the hospital’s internal investigation. A rating is given for the “strength” of actions and the comprehensibility of the report for public presentation.9
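As a sketch of the escalation rule just described, the hypothetical function below applies the two trigger thresholds (twice the state average rate for non-mortality indicators; 75% above the state average for mortality indicators). The function name, parameters and figures are illustrative assumptions, not Queensland Health's implementation.

```python
def needs_external_review(hospital_rate, state_rate, is_mortality_indicator):
    """Illustrative only: apply the escalation thresholds described in the
    text. Non-mortality indicators trigger external review of the hospital's
    investigation at twice the state average rate; mortality indicators at
    75% above the state average."""
    multiplier = 1.75 if is_mortality_indicator else 2.0
    return hospital_rate >= multiplier * state_rate


# Example: a complication rate of 9% against a state average of 4%
# (hypothetical figures) exceeds the 2x threshold, so the hospital's
# internal investigation report would be reviewed externally.
print(needs_external_review(0.09, 0.04, is_mortality_indicator=False))  # True
```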

This dynamic, improvement-focused approach to quality management was first used as the basis for the mandated public reporting in 2008. Although quantitative performance data for each of the 30 indicators for each relevant hospital are published as a separate table on the Internet,11 the main printed public report (also available online)12 focuses on whether the indicator performance of any individual hospital was significantly different from the state average and, more importantly, on the actions that the identified hospital is taking in response to flagged variations from the average. Examples of published summaries of investigations are shown in the Box. A similar approach is taken to reporting on clinical incidents and sentinel events.7

Purposeful public reporting

While controversy remains about its value and purpose,13,14 the case for public reporting has been variously argued as being:15-18

  • to help consumers make informed choices about where to seek care;
  • to stimulate quality improvement activity by provider organisations; and
  • to ensure public accountability.

Contrary to the first stated aim, consumers appear not to rely on public reports when selecting hospitals.19 But even if patients cannot or do not use public reporting, the other two goals (improvement and accountability) may still warrant its continuation. The two goals are linked, as presumably one purpose of public accountability is to stimulate remedial action where necessary. In a systematic review, Fung and colleagues showed that public reporting did indeed stimulate hospital quality improvement activity, but that the impact on clinical outcomes was “mixed”.19 There is still controversy about the emphasis on outcome indicators (such as mortality rates) in public reporting, with a steady stream of research suggesting that process measures (such as prophylaxis against venous thromboembolism), which are rarely captured in routine data, are more useful in guiding quality improvement efforts.20-22

More fundamentally, public reports may also be criticised because they are based on an outmoded conception of the quality endeavour. The very terminology of “score cards” and “report cards” brings to mind the picture of an errant schoolboy standing in the corner awaiting discipline for poor performance,23 part of the “name–shame–blame” culture that has pervaded the health sector’s approach to safety and quality in the past. A “just” culture is now seen as critical to redressing quality problems, but unfortunately this approach appears not to have permeated the thinking of public reporting’s advocates, and so there has been no reconceptualisation of public reporting. The name–shame–blame approach, criticising the data and often “shooting the messenger”, encourages a defensive response from hospitals.

Queensland Health’s approach responds to two of the aims of public reporting: providing public accountability and stimulating action. In terms of the latter, it does this not by trading on hospitals’ concern with risk to their reputation, but by requiring and reporting on results of actual investigations. Although media reporting of the new approach in Queensland still focuses on naming, shaming and blaming,24-26 this is to some extent characteristic of the tabloid approach that pervades, but is not unique to, journalism in Queensland.

Conclusion

Health professionals are now encouraged to report incidents and near misses so that we can learn from them. This involves creating internal cultures free from inappropriate blame. But the external reporting environment for most hospitals has not kept pace with this change in internal culture. The emphasis is still generally on a cross-sectional, static approach to identifying poorly performing facilities that involves naming and shaming hospitals to stimulate action for improvement.

In contrast, what we are trying to do in Queensland is to change the emphasis of public reporting from simply pointing the finger at hospitals whose performance is below average to focusing on what action is being taken to improve their performance. We are thus trying to focus on a quality improvement outcome rather than on public shaming.

Examples of published summaries of local investigations

Laparoscopic cholecystectomy complications of surgery: Gold Coast Hospital

A review revealed data issues caused by poor documentation in patient medical charts. The Director of Surgery is working with the Coder Educator to make sure that documentation is clear and understandable so that the information can be coded correctly.

Paediatric tonsillectomy and adenoidectomy readmission: Royal Children’s Hospital

Hospital examination has highlighted a potential management issue. The Ear, Nose and Throat Clinical Nurse Consultant (CNC) was on extended leave during the latter part of 2006, which may have affected the level of advice and education provided to parents and patients. The CNC has incorporated age-specific education into the CNC Succession Plan so that appropriate education is provided to all patients.

Paediatric tonsillectomy and adenoidectomy readmission: Mater Children’s Public Hospital

Data, patient casemix and process of care were three areas identified as contributing to this flag. As a result, Mater Children’s Public Hospital has implemented a clinical care pathway. Data reviewed were found to contain coding errors. These errors have been fixed and the data have been resubmitted.

Heart failure in-hospital mortality: Townsville Hospital

Hospital review revealed that data and casemix issues significantly contributed to this flag, and identified a cohort of complex high-risk patients with end-stage disease. There were no patterns or significant omissions in care. Issues with incomplete discharge summaries and coding were discovered, and education sessions are scheduled for each specialty area within the hospital. Implementation of a mortality review process for every patient death should contribute to clinical coding accuracy in the future.

  • Stephen J Duckett1
  • Justin Collins2
  • Maarten Kamp2
  • Kew Walker2

  • 1 Centre for Healthcare Improvement, Queensland Health, Brisbane, QLD.
  • 2 Clinical Practice Improvement Centre, Queensland Health, Brisbane, QLD.



Competing interests:

All authors are employees of Queensland Health, which is the subject of this article.

  • 1. Clarke S, Oakley J. Informed consent and surgeons’ performance. J Med Philos 2004; 29: 11-35.
  • 2. Metherell M. Hospital bungles to be exposed. Sydney Morning Herald 2008; 23 Jul. http://www.smh.com.au/news/national/bungles-to-be-exposed/2008/07/22/1216492448149.html (accessed Oct 2008).
  • 3. Wallace N. Rudd plan to publish death rates. Sydney Morning Herald 2008; 5 Jul. http://www.smh.com.au/news/national/pm-plan-to-publish-death-rates/2008/07/04/1214951042669.html (accessed Oct 2008).
  • 4. Australian Institute of Health and Welfare. A set of performance indicators across the health and aged care system. Canberra: AIHW, 2008. http://www.aihw.gov.au/indicators/performance_indicators_200806_draft.pdf (accessed Oct 2008).
  • 5. Van Der Weyden MB. The Bundaberg Hospital scandal: the need for reform in Queensland and beyond. Med J Aust 2005; 183: 284-285. <MJA full text>
  • 6. Thomas H. Sick to death: a manipulative surgeon and a health system in crisis — a disaster waiting to happen. Sydney: Allen and Unwin, 2007.
  • 7. Wakefield J. Patient safety: from learning to action. First Queensland Health report on clinical incidents and sentinel events. Brisbane: Queensland Health, 2007. http://www.health.qld.gov.au/patientsafety/documents/patsafereport.pdf (accessed Oct 2008).
  • 8. Duckett S. A new approach to clinical governance in Queensland. Aust Health Rev 2007; 31 Suppl 1: S16-S19.
  • 9. Duckett SJ, Coory M, Sketcher-Baker K. Identifying variations in quality of care in Queensland hospitals. Med J Aust 2007; 187: 571-575. <MJA full text>
  • 10. Clinical Practice Improvement Centre. VLADs for dummies. Milton, Qld: Wiley, 2008.
  • 11. Queensland Health. Additional table of results. Moving ahead: Queensland public hospitals performance report 2006–07. Brisbane: Queensland Health, 2007. http://www.health.qld.gov.au/performance/docs/Additional_Table.pdf (accessed Oct 2008).
  • 12. Queensland Health. Moving ahead: Queensland public hospitals performance report 2006–07. Brisbane: Queensland Health, 2007. http://www.health.qld.gov.au/performance/docs/phpr_2006-07_full2.pdf (accessed Oct 2008).
  • 13. Pronovost PJ, Miller M, Wachter RM. The GAAP in quality measurement and reporting. JAMA 2007; 298: 1800-1802.
  • 14. Colmers JM. Public reporting and transparency. New York: The Commonwealth Fund, 2007.
  • 15. Berwick DM, James B, Coye MJ. Connections between quality measurement and improvement. Med Care 2003; 41 Suppl 1: I30-I38.
  • 16. Hibbard JH, Stockard J, Tusler M. Hospital performance reports: impact on quality, market share, and reputation. Health Aff (Millwood) 2005; 24: 1150-1160.
  • 17. Hamblin R. Publishing “quality” measures: how it works and when it does not? Int J Qual Health Care 2007; 19: 183-186.
  • 18. Mason A, Street A. Publishing outcome data: is it an effective approach? J Eval Clin Pract 2006; 12: 37-48.
  • 19. Fung CH, Lim YW, Mattke S, et al. Systematic review: the evidence that publishing patient care performance data improves quality of care. Ann Intern Med 2008; 148: 111-123.
  • 20. Davies HT, Crombie IK. Interpreting health outcomes. J Eval Clin Pract 1997; 3: 187-199.
  • 21. Crombie IK, Davies HT. Beyond health outcomes: the advantages of measuring process. J Eval Clin Pract 1998; 4: 31-38.
  • 22. Lilford RJ, Brown CA, Nicholl J. Use of process measures to monitor the quality of clinical practice. BMJ 2007; 335: 648-650.
  • 23. Frank T. What’s the matter with Kansas? How conservatives won the heart of America. New York: Metropolitan Books, 2004.
  • 24. Wardill S. Queensland’s public hospitals fail health tests. Courier Mail (Brisbane) 2008; 11 Feb. http://www.news.com.au/couriermail/story/0,23739,23191353-5003426,00.html (accessed Oct 2008).
  • 25. Davies H. Hospitals fail new mothers. Courier Mail (Brisbane) 2008; 2 Mar. http://www.news.com.au/couriermail/story/0,23739,23302715-5003426,00.html (accessed Oct 2008).
  • 26. Marshall MN, Romano PS, Davies HT. How do we maximize the impact of the public reporting of quality of care? Int J Qual Health Care 2004; 16 Suppl 1: i57-i63.
