Investigating apparent variation in quality of care: the critical role of clinician engagement

Andrew L L Clarke, William Shearer, Alison J McMillan and Paul D Ireland
Med J Aust 2010; 193 (8): S111. doi: 10.5694/j.1326-5377.2010.tb04025.x
Published online: 18 October 2010

The pursuit of efficient and effective means of monitoring and improving quality and safety of patient care is occurring globally. At a national level, through the 2008 National Healthcare Agreement, the Council of Australian Governments agreed to report on national trends and the performance of all jurisdictions on a variety of health measures, including selected indicators of quality and safety. In November 2009, the Australian Health Ministers’ Conference (AHMC) endorsed the recommendation by the Australian Commission on Safety and Quality in Health Care (ACSQHC) that all Australian hospitals routinely review reports based on a series of hospital-based outcome indicators. This aligns with another AHMC initiative: for the ACSQHC to publish a national safety and quality framework. The proposed draft includes “driven by information” as one of three elements of the national framework’s vision, along with a strategy for action by health systems and providers to “continually monitor the effects of healthcare intervention”.1

At a state level, the Victorian Clinical Governance Policy Framework2 is used by the Department of Health to map its performance monitoring activities. Clinical effectiveness is one of the Victorian Framework’s domains of quality and safety, and two aspects are specified for this domain.

The Framework commits the Department to develop a set of core quality and safety indicators for use in clinical governance processes, recognising that this will interdigitate with a range of national processes. Importantly, one of the Framework’s stated principles is that clinicians and clinical teams are directly responsible and accountable for the safety and quality of care they provide.

This article describes the Department’s challenge in developing a suite of indicators that aligns with clinicians’ desire to continually improve the quality of their clinical practice, the Department’s responsibility to monitor, evaluate and improve performance, and the recognition that clinicians are themselves responsible for the safety and quality of the care they provide.

Quality and safety initiatives in Victoria

Before the development of the above frameworks, the Victorian Auditor-General recommended the use of routinely collected hospital administrative data, specifically the Victorian Admitted Episodes Dataset (VAED), to monitor patient safety and quality. The Department commenced work to investigate the utility of VAED data as a means of identifying apparent variations in patient safety and quality of care related to clinical outcomes. The use of administrative data for this purpose requires the development of indicators based on appropriate coding algorithms. The Department developed a suite of 18 core indicators and seven subindicators, known as the AusPSI set, with coding algorithms based on the International Classification of Diseases, 10th revision, Australian modification (ICD-10-AM).3
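
To give a sense of the general shape of such a coding algorithm, the sketch below flags episodes from coded administrative data and computes a crude rate. It is illustrative only: the field names, the ICD-10-AM code set and the inclusion rule are hypothetical placeholders, not the AusPSI specifications.

    # Illustrative sketch only: a simplified indicator "coding algorithm" applied to
    # coded episode records. Field names and the ICD-10-AM code set are hypothetical
    # placeholders, not the actual AusPSI specifications.

    NUMERATOR_CODES = {"T81.0", "T81.1"}  # hypothetical complication codes

    def flag_episode(episode):
        """Return True if any secondary diagnosis falls within the indicator's
        code set and arose during the admission (rather than being present
        on admission)."""
        return any(
            dx["code"] in NUMERATOR_CODES and dx["onset"] == "during_episode"
            for dx in episode["secondary_diagnoses"]
        )

    def indicator_rate(episodes):
        """Crude (unadjusted) rate of flagged episodes among in-scope episodes."""
        if not episodes:
            return 0.0
        return sum(flag_episode(e) for e in episodes) / len(episodes)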

Like the Queensland Health approach,4 our intention was to develop clinical indicators to enable the investigation of apparent variations in practice. Central to this undertaking was the belief that an important distinction can be made between formative and summative assessment.5 Formative assessments (clinical indicators) are diagnostic and help clinicians take control of areas for improvement, which is appropriate as clinicians are responsible and accountable for the safety and quality of care they provide. Summative assessments (performance measures) are evaluative and used primarily for external purposes such as public reporting.

We determined that the measurement of indicators by means of administrative data offers a formative way of driving improvement at local levels. It is not clear that using administrative data to measure performance in a summative way is valid or even appropriate.6,7 This point is well made by Duckett and colleagues.4

Through the Patient Safety Monitoring Initiative (PSMI), the Department drew on the Queensland Health initiative of using statistical process control charts, specifically variable life-adjusted displays (VLADs), to identify variations in selected outcomes across hospitals. The Queensland Health approach to monitoring clinical outcomes represents a significant increase in centralised monitoring.4 However, Victoria operates a devolved hospital governance model, so the implementation of sequential monitoring using VLADs in this context relies heavily on clinician acceptance.
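
As a rough illustration of the type of chart involved (a minimal sketch, not the Queensland Health or departmental implementation), a VLAD tracks the cumulative difference between each patient’s modelled risk of an adverse outcome and the observed outcome, so the plotted line drifts upward when outcomes are better than expected and downward when they are worse.

    # Minimal sketch of the statistic behind a variable life-adjusted display (VLAD).
    # expected[i] is the modelled probability of the adverse outcome (e.g. death)
    # for patient i; observed[i] is 1 if the outcome occurred, otherwise 0.
    # Risk-adjustment modelling and flagging thresholds are omitted.

    def vlad_series(expected, observed):
        """Cumulative sum of (expected risk - observed outcome), in admission order.
        Upward drift suggests fewer adverse outcomes than expected."""
        series, total = [], 0.0
        for p, y in zip(expected, observed):
            total += p - y
            series.append(total)
        return series

    # Example: three survivors then one death, each with a modelled risk of 10%.
    print(vlad_series([0.1, 0.1, 0.1, 0.1], [0, 0, 0, 1]))
    # [0.1, 0.2, 0.3, -0.6] (approximately, allowing for floating-point rounding)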

Through the establishment of the PSMI, the Department chose to work collaboratively with health services, and clinicians in particular, to develop a suite of indicators based on earlier work. The Department invited 20 Victorian public health services to be involved in assessing the value of 11 selected indicators.

The 11 indicators comprised six Victorian-defined indicators: Death in low mortality DRGs (diagnosis-related groups), Complications of anaesthesia, In-hospital fracture, Postoperative haemorrhage or haematoma, Postoperative deep vein thrombosis/pulmonary embolus, and Obstetric trauma — vaginal or caesarean delivery. Five Queensland Health clinical indicators were selected for initial testing in Victoria: Stroke in-hospital mortality, Heart failure in-hospital mortality, AMI (acute myocardial infarction) in-hospital mortality, Pneumonia in-hospital mortality, and Fractured neck of femur in-hospital mortality.

An introductory forum held on 1 April 2009 was attended by over 40 health service representatives, including senior clinicians, directors and managers of quality and risk, and health information managers. The frank and open discussion at this forum demonstrated that concerns about the use of administrative data presented several hurdles to be overcome.

A reference group was established, comprising health service representatives from each participating health service — mainly directors and managers of quality and risk, independent epidemiologists and the Department’s program team. The reference group was intended to provide a forum to continue and improve the collaboration between the Department and health services.

The reference group, through the program team, has sought advice about the PSMI indicators from relevant clinical peak bodies. The Victorian Consultative Council on Anaesthetic Mortality and Morbidity, while acknowledging the need for a robust indicator for complications arising from anaesthesia, highlighted the inability of the PSMI indicator to identify accurately true anaesthetic-related adverse events. In response, the Department stopped reporting this indicator. Advice from clinicians at a major maternity hospital suggested we reconsider the definition for the Obstetric trauma indicator. We responded by amending the definition in line with the advice received.

Clinicians frequently request de-identified “benchmarking” information, reflecting an apparent desire to know how their performance compares with their own past achievement and with the achievement of others. Constant reference to one’s own performance alone can have a deleterious effect, described as self-reference inertia.8 Providing a source of comparison can be a positive motivator to match and exceed the achievements of others.

Because we want to provide a means of comparison, we supply de-identified hospital-level statewide rates in the form of funnel plots, which give hospitals a clearer picture of their performance over a defined period. We posit that control charts complement cross-sectional funnel plots rather than replace them, but agree that more work is needed to determine how best to engage end users of clinical indicators.9
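
To indicate what such a funnel plot involves (a minimal sketch using a standard binomial approximation and an assumed statewide rate, not the departmental specification), each hospital’s crude rate is plotted against its number of in-scope episodes, with control limits around the statewide rate that narrow as volume increases.

    import math

    def funnel_limits(statewide_rate, volumes, z=3.0):
        """Approximate funnel plot control limits: the statewide rate +/- z standard
        errors under a binomial model (normal approximation). z of about 2 and 3
        correspond roughly to 95% and 99.8% limits."""
        limits = []
        for n in volumes:
            se = math.sqrt(statewide_rate * (1 - statewide_rate) / n)
            limits.append((n, max(0.0, statewide_rate - z * se),
                              min(1.0, statewide_rate + z * se)))
        return limits

    # Example: a hypothetical statewide rate of 2% across hospitals of varying volume.
    for n, lower, upper in funnel_limits(0.02, [100, 500, 1000, 5000]):
        print(f"n={n}: {lower:.3f} to {upper:.3f}")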

We understand that striking the at-times-elusive balance between benefit and over-investigation depends on cooperative efforts with clinicians.

Discussion

The Department is continually mindful that clinicians and clinical teams are directly responsible and accountable for the safety and quality of care they provide, and that it is important to adopt an inclusive system of review and refinement that is not adversarial or judgemental.

Participating hospitals report limited involvement of clinicians (partly due to a lack of confidence in the data source), a higher-than-anticipated level of strain on resources (although this is difficult to quantify), and too great a delay between incident and report at the hospital level.

Despite the lack of enthusiasm about the process, quality improvement can result when the initiative’s principles are applied at the clinical level. One major Victorian metropolitan hospital demonstrated this when a flag arose in the hospital’s report for Postoperative deep vein thrombosis/pulmonary embolus. An audit revealed patients included in the PSMI report had experienced thrombophlebitis, not thromboembolism.

Embracing the spirit of the PSMI, which is to identify possible areas for improvement, the hospital asked: “Why are we seeing so many cases of thrombophlebitis?” Clinicians discovered that intravenous (IV) appliances were being left in situ longer than necessary. They reviewed guidelines for the removal of IV appliances and implemented an updated process. As a result, the rate of thrombophlebitis as a complication of IV appliances decreased.

This is not an experience reported by other hospitals involved in the PSMI. It does, however, indicate that improvement in care through the use of indicators based on administrative data is possible. We recognise that the key is not the data but the involvement of clinicians.

As already noted, hospitals have expressed concern that investigation of apparent variation flagged by VLADs creates a higher-than-anticipated level of strain on resources. Besides the investigation of apparent variation itself, the number of indicators reported also contributes to the resource burden. It is critical not to divert clinical resources away from patient care in order to engage clinicians in an investigation. Diverting clinicians from patient care may itself create an environment that compromises safety and quality.

The federal government’s MyHospitals website was scheduled to be launched during August 2010. However, the caretaker period associated with the national election delayed its commencement. Information will be available on the site to compare the performance of public hospitals against national benchmarks. The intention is for information to be nationally consistent and locally relevant. MyHospitals will present hospital-level data as agreed by states and territories in the National Health and Hospitals Network Agreement. Over time, the content of the website will be expanded to add further information, and to include private hospitals.

Publicly reporting information about the performance of hospitals may help the public understand the performance of hospitals in their area. Hospitals may be motivated to improve their performance lest they appear to fall behind the pack. However, critical clinical engagement may be in jeopardy if clinical indicators developed for formative purposes are used as performance measures. Care at the bedside will only improve if clinicians have confidence in the information they receive, and have the resources required to generate improvement in care processes.

In Victoria, there are many well established clinician-driven processes for monitoring quality of patient care in specific clinical areas. Examples include the Surgical Outcomes Information Initiative, the Victorian Cardiac Surgery Database project, the Victorian Intensive Care Data Review Committee, and the Limited Adverse Occurrence Screening process. These all involve senior clinicians from multiple health services collaborating on common improvement activities. The PSMI provides an ideal opportunity for emerging clinical leaders to take local ownership and develop expertise in investigating possible deficiencies in processes of care and implementing change as required.

It must be stated clearly that the PSMI program is in its early stages and there is still much to be explored and learnt. The Department will continue to explore and learn with the assistance of its clinical partners.

  • Andrew L L Clarke1
  • William Shearer2
  • Alison J McMillan1
  • Paul D Ireland1

  • 1 Quality, Safety and Patient Experience Branch, Victorian Department of Health, Melbourne, VIC.
  • 2 Quality Unit, Southern Health, Melbourne, VIC.



Acknowledgements: 

The PSMI reference group: Robyn Wright; Fiona Webster; Susan Gervasoni; David Plueckhahn; Tony Poskus; Marrianne Beaty; Jigi Lucas; Merrin Prictor; Joanne Rash; Jill Butty; Nathan Farrow; Kirrily Gilchrist; Sue Goonan; Fiona Watson; Justin King; Linda Miln; Lynne Rigg; Anthony Gust; Anne Maddock; Alison Rule; Associate Professor Caroline Brand; Jennie Shepheard; Simon Waters; Dr Vijaya Sundararajan; Dr Sue Evans.

Competing interests:

None identified.

  • 1. Australian Commission on Safety and Quality in Health Care. The proposed national safety and quality framework. http://www.safetyandquality.gov.au/internet/safety/publishing.nsf/Content/C774AE55079AFD23CA2577370000783C/$File/22177-NSQF.pdf (accessed Aug 2010).
  • 2. Victorian Government Department of Human Services. Victorian clinical governance policy framework: enhancing clinical care. Melbourne: DHS, 2009.
  • 3. McConchie S, Shepheard J, Waters S, et al. The AusPSI: the Australian version of the Agency of Healthcare Research and Quality patient safety indicators. Aust Health Rev 2009; 33: 334-341.
  • 4. Duckett SJ, Coory M, Sketcher-Baker K. Identifying variations in quality of care in Queensland hospitals. Med J Aust 2007; 187: 571-575. <MJA full text>
  • 5. Boston C. The concept of formative assessment. Pract Assess Res Eval [internet] 2002; 8 (9). http://PAREonline.net/getvn.asp?v=8&n=9 (accessed Mar 2010).
  • 6. Schwartz RM, Gagnon DE, Muri JH, et al. Administrative data for quality improvement. Pediatrics 1999; 103 (1 Suppl E): 291-301.
  • 7. Clinical Practice Improvement Centre. VLADs for dummies. Queensland Health ed. Brisbane: Wiley Publishing Australia, 2008.
  • 8. Scott I, Phelps G. Measurement for improvement: getting one to follow the other. Intern Med J 2009; 39: 347-351.
  • 9. Coory M, Duckett S, Sketcher-Baker K. Using control charts to monitor quality of hospital care with administrative data. Int J Qual Health Care 2008; 20: 31-39.
