Solving the information overload problem: a letter from Canada

David A Davis, Laure Perrier, Ileana Ciurea and Tanya M Flanagan, on behalf of the Ontario Guidelines Advisory Committee*
Med J Aust 2004; 180 (6): S68. doi: 10.5694/j.1326-5377.2004.tb05952.x
Published online: 15 March 2004

Abstract

  • Doctors are inundated with medical information, some inadequately evidence-based, much of it captured in clinical practice guidelines (CPGs).

  • The Ontario Guidelines Advisory Committee (GAC) selects topic areas, searches for all CPGs on the topic, and reviews them using the AGREE Instrument.

  • Based in large part on the AGREE score, the GAC summarises one guideline in each topic area and mounts it on its website, with links to other information (eg, clinical algorithms) where possible.

  • Two topic areas have been selected for implementation — the reduction of unnecessary preoperative testing and the rational management of acute low back pain.

  • Implementation strategies include performance feedback, training of opinion leaders, development of algorithms and reminders, and communication through journals and continuing medical education activities.

The gap between what doctors might do (based on evidence-based clinical practice guidelines [CPGs]) and what they actually do is wide, variable1 and growing. Many factors contribute to this situation. Doctors are inundated with new, often poorly evidence-based and sometimes conflicting clinical information. This is particularly serious for the generalist, with over 400 000 articles added to the biomedical literature each year. Adding further pressure to the “gap” are workloads that have increased over the past decade: doctors are seeing more patients with acute and complex conditions.2 Canadian medical practitioners feel that they are on a “medical treadmill”, working an average of 53.8 hours per week.3 Rural practitioners work even longer hours, offer more medical services and perform more clinical procedures than their urban counterparts4 — thus facing an even greater need for up-to-date information.

Compounding this problem is another: there is good evidence that what we do in continuing medical education (CME) (produce courses, give didactic lectures, mail unsolicited printed materials) is not very effective in changing physician behaviour.5-7 Interventions that do show promise (such as reminders at the point of care, assistance of opinion leaders, academic detailing, feedback on performance) are uncommon and not well used by policymakers, CME providers and others.

Thus, doctors are faced with many challenges to practising optimal care. There is too much, often conflicting, information that is not easily digestible, insufficiently evidence-based, and not delivered in a timely, effective or coordinated manner. To address some of these challenges, the Ontario Guidelines Advisory Committee (GAC) has adopted a best-practice guideline strategy to support doctors in their endeavours to practise high-quality care. Here we briefly describe the GAC’s evidence-based guideline search, review and endorsement processes across the spectrum of clinical practice. We then focus on two specific clinical areas to illustrate the type of work we do to promote appropriate practice performance.

Solving the message problem: guideline search, review, endorsement and synopsis

Formed in 1997, the GAC is a joint body of the Ontario Medical Association and the Ontario Ministry of Health and Long-Term Care, with representation from the Institute for Clinical Evaluative Sciences. Although a relatively small body, its mission is to select, review and implement CPGs. To fulfil this mission, it has also convened a broader group of stakeholders in the dissemination of information and the assessment of outcomes in the province, called the “Guideline Collaborative”.

Guideline topics span a wide variety of primary-care areas generated by the committee members and by sections within the Ontario Medical Association, as well as more specialised topics (eg, the appropriate use of echocardiography, pelvic ultrasound and hyperbaric oxygen therapy). Once a clinical topic is identified, a systematic search of medical literature databases and Internet-based guideline sites is conducted. The guidelines identified are then sent to trained physician reviewers throughout the province for peer review, using a validated assessment tool, the AGREE (Appraisal of Guidelines Research and Evaluation) instrument.8 Currently, the GAC has a bank of over 50 trained doctors to assist in the guideline review process. Each guideline is assessed by a minimum of three reviewers; assessments are aggregated and given an “apple” rating, with four apples denoting an excellent guideline. The review process provides an evidence-based method for assessing the quality of CPGs, identifies potential bias in the guideline development process and ensures that the recommendations are both internally and externally valid. The AGREE instrument is also useful in critically evaluating the methods used to develop the guidelines, the content of the final recommendations, and the factors linked to their uptake.
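
To make the aggregation step concrete, the short Python sketch below shows one way standardised AGREE domain scores from three or more reviewers could be rolled up into an “apple” rating. The item counts, sample scores and apple cut-offs are illustrative assumptions only; the GAC’s actual scoring rules are not described here.

# Hypothetical aggregation of AGREE appraisals into an "apple" rating.
# The apple thresholds and sample data are illustrative assumptions,
# not the GAC's published method.

def standardised_domain_score(item_scores_by_reviewer):
    """item_scores_by_reviewer: one list per reviewer of item scores
    (original AGREE 1-4 scale) for a single domain."""
    n_reviewers = len(item_scores_by_reviewer)
    n_items = len(item_scores_by_reviewer[0])
    obtained = sum(sum(scores) for scores in item_scores_by_reviewer)
    max_possible = 4 * n_items * n_reviewers
    min_possible = 1 * n_items * n_reviewers
    return 100 * (obtained - min_possible) / (max_possible - min_possible)

def apple_rating(domain_scores):
    """Map the mean standardised domain score (0-100) to 1-4 apples;
    the cut-offs here are purely illustrative."""
    mean_score = sum(domain_scores) / len(domain_scores)
    for apples, cutoff in ((4, 80), (3, 60), (2, 40)):
        if mean_score >= cutoff:
            return apples
    return 1

# Example: one domain appraised by three reviewers, seven items each.
rigour = [[4, 3, 4, 4, 3, 4, 4], [3, 3, 4, 4, 4, 3, 4], [4, 4, 4, 3, 4, 4, 4]]
print(standardised_domain_score(rigour))       # about 90
print(apple_rating([90.0, 72.0, 65.0, 87.0]))  # 3 apples under these cut-offs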

Using these assessments, the GAC further reviews the guidelines based on its medical expertise, its knowledge of the Ontario healthcare system and the recency of the guideline. The Committee then recommends the most timely, relevant and evidence-based CPGs for uptake by doctors in Ontario. As at September 2003, GAC medical reviewers had reviewed 448 individual guidelines, and 56 best-practice guidelines had been endorsed.

To help with the time challenges facing Ontario doctors, the GAC summarises recommended CPGs into usable one- to two-page summaries (posted on the GAC website9 and featured monthly in the Ontario Medical Review, published by the Ontario Medical Association). The GAC website also provides electronic links to clinical decision tools, algorithms and patient educational materials, where they exist. As an additional method of raising physician awareness of GAC recommendations, the GAC showcases a tabletop presentation of its work at many functions organised by the Ontario Medical Association and other partner organisations.

Solving the message-delivery problem: an integrated approach to implementing guidelines

The GAC and its partner organisations (Box 1) have undertaken a plan to implement best practice in two areas: (a) preoperative testing, and (b) managing acute low back pain. Both plans recognise that, although single-intervention guideline education initiatives are the most common, the simplest to plan and the least expensive to implement, multiple concurrent implementation activities are more effective.6,10 The methods used in each of the two clinical areas are shown in Box 2.

Preoperative testing

Routine preoperative electrocardiograms and chest x-rays are commonly conducted despite little evidence of benefit,11 especially in low-risk patients and patients having low-risk procedures such as cataract surgery. The GAC has endorsed two preoperative testing guidelines that indicate that routine preoperative testing provides no benefit to patients and contributes to millions of wasted healthcare dollars annually.12,13

Currently, the GAC is focused on implementing these guidelines using a multipronged approach involving feedback, opinion leaders, CME and a clinical algorithm.

Feedback: There is evidence that providing doctors with feedback on their individual performance compared with that of a peer group is an effective means of changing physician behaviour.14-16 Accordingly, the Institute for Clinical Evaluative Sciences reviewed the usage rates of preoperative electrocardiograms and chest x-rays for selected high-volume surgical procedures of low to intermediate risk. Feedback profiles were mailed to hospitals in May 2003. The feedback data included hospital-specific usage rates for preoperative chest x-rays and electrocardiograms, hospital peer-group usage rates for preoperative chest x-rays and electrocardiograms, and a hospital peer-group benchmark rate of testing. All data were handled in a secure environment under strict confidentiality provisions.
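
As a concrete illustration of the profile contents, the Python sketch below computes hospital-specific usage rates, a peer-group rate and a peer-group benchmark for a routine preoperative test. The hospital names, volumes and choice of benchmark (the lowest-quartile peer rate) are invented for illustration and are not the Institute’s actual method or data.

from statistics import quantiles

# (hospital, low-risk procedures, procedures preceded by a routine ECG or chest x-ray)
peer_group = [
    ("Hospital A", 1200, 540),
    ("Hospital B", 950, 190),
    ("Hospital C", 800, 600),
    ("Hospital D", 1100, 275),
]

rates = {name: tested / n for name, n, tested in peer_group}
peer_rate = sum(t for _, _, t in peer_group) / sum(n for _, n, _ in peer_group)
benchmark = quantiles(rates.values(), n=4)[0]   # 25th percentile of peer rates

for name, rate in rates.items():
    print(f"{name}: {rate:.0%} of procedures tested "
          f"(peer group {peer_rate:.0%}, benchmark {benchmark:.0%})")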

Opinion leaders: The GAC has trained doctors as opinion leaders — individuals who reside and work in a community, and to whom people often turn for informal support or advice.17 Opinion leaders include family physicians, anaesthesiologists, hospital administrators and others. Following their training, the GAC continues to support these individuals throughout the province as they work towards changing practice in their local communities.

Continuing medical education: The GAC is working with the continuing education divisions of the five Ontario medical schools to implement the GAC-recommended preoperative testing guidelines and other GAC-endorsed guidelines into existing CME activities and events. Having established partnerships with the CME departments of the five medical schools, the GAC aims to facilitate the provision of evidence-based preoperative testing guidance to all physicians in Ontario through a consistent clinical message.

Clinical algorithm: The GAC has developed and circulated to provincial hospitals an acceptable clinical algorithm to function as a reminder — a tool of some importance in changing provider behaviour.18
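
The simplified Python sketch below illustrates the kind of point-of-care reminder logic such an algorithm embodies: no routine electrocardiogram or chest x-ray without a specific clinical indication, particularly for low-risk patients having low-risk procedures. It is an assumption-laden illustration, not the algorithm the GAC circulated, and no substitute for clinical judgement.

# Illustrative reminder logic only; the procedure list and wording are assumptions.
LOW_RISK_PROCEDURES = {"cataract surgery", "knee arthroscopy", "minor skin surgery"}

def preop_test_reminder(procedure: str, patient_low_risk: bool,
                        clinical_indication: bool) -> str:
    """Return reminder text for a routine preoperative ECG/chest x-ray order."""
    if clinical_indication:
        return "Order the test for the documented indication (not as a routine screen)."
    if patient_low_risk and procedure.lower() in LOW_RISK_PROCEDURES:
        return "Low-risk patient, low-risk procedure: routine ECG/chest x-ray not recommended."
    return "No clinical indication recorded: routine preoperative testing is discouraged."

print(preop_test_reminder("Cataract surgery", patient_low_risk=True, clinical_indication=False))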

Managing acute low back pain

The key messages in caring for patients with acute low back pain — watch out for “red flags” denoting infection, tumours and other comorbidities; encourage physical activity; minimise the use of painkillers and other medications; avoid x-rays of the lumbosacral spine — are well known but not universally adopted as practice guidelines.19,20 Consequently, the GAC has endorsed a CPG in this area.21 Furthermore, following a meeting of its key stakeholders and other bodies (the Workers’ Safety and Insurance Board, the Institute for Work and Health), the GAC has endorsed (though not yet fully implemented) a coordinated implementation plan. The Committee borrows from the evidence regarding effective continuing education and uses a multifaceted approach. Among other interventions (Box 2), the CME divisions of the five medical schools and other providers of CME, such as the Ontario College of Family Physicians, the Foundation for Medical Practice and the Royal College of Physicians and Surgeons of Canada, will assist in the deployment of large and small group CME events and activities.

Evaluating the effectiveness of the GAC guideline implementation process

In making any evaluative comments about the GAC process and plans, it is necessary to be cautious about interpreting the impact of the project. First, the manner in which guideline review and endorsement is undertaken is imperfect, including the application of the AGREE instrument. Second, the research on which the guideline implementation and endorsement strategies are based is imperfect at best,22 and the way in which we have been able to roll out the plans has been moulded by practical and practice realities that lie well beyond the scope of our project and are not easily amenable to formal, objective evaluation. Third, the measures we intend to use to assess the impact of our efforts — a province-wide pre- and post-implementation analysis of hospital utilisation of routine preoperative electrocardiograms and chest x-rays and of lumbosacral spinal x-rays — have not yet been completed. Rates of change in the use of these tests will be used as a proxy to measure the GAC’s success at implementing guidelines and changing physicians’ behaviour, but we recognise that these are crude measures, insensitive to many practice and practitioner realities.
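
When those data are available, the proxy measure itself will be simple: a comparison of routine test rates before and after implementation. A minimal Python sketch, with invented figures, of how such a change might be expressed:

# Invented figures; the province-wide analysis has not yet been completed.
def rate_change(pre_tests, pre_procedures, post_tests, post_procedures):
    pre_rate = pre_tests / pre_procedures
    post_rate = post_tests / post_procedures
    absolute = post_rate - pre_rate
    relative = absolute / pre_rate
    return pre_rate, post_rate, absolute, relative

pre, post, abs_chg, rel_chg = rate_change(41_000, 100_000, 33_000, 98_000)
print(f"pre {pre:.1%}, post {post:.1%}, change {abs_chg:+.1%} ({rel_chg:+.0%} relative)")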

Nonetheless, through the work of the GAC, the Ontario Ministry of Health and Long-Term Care and the Ontario Medical Association have demonstrated their commitment to the physicians of Ontario and to the endorsement of useful and provincially approved guidelines. Further, through its implementation planning and execution, the Ontario Guideline Collaborative has engendered goodwill and garnered support for an evidence-based guideline endorsement and implementation program. Perhaps the clearest sign of success is that the GAC and its collaborative efforts exist at all, aiming to improve the adoption of best evidence by Ontario’s medical practitioners.

  • David A Davis1
  • Laure Perrier2
  • Ileana Ciurea3
  • Tanya M Flanagan4
  • on behalf of the Ontario Guidelines Advisory Committee*

  • 1 Continuing Education, Faculty of Medicine, University of Toronto, Toronto, ON, Canada.
  • 2 Ontario Guidelines Advisory Committee, Ontario Ministry of Health and Long-Term Care and Ontario Medical Association, Toronto, ON, Canada.


Competing interests:

None identified.

  • 1. Bero LA, Grilli R, Grimshaw JM, et al. Closing the gap between research and practice: an overview of systematic reviews of interventions to promote the implementation of research findings. The Cochrane Effective Practice and Organization of Care Review Group. BMJ 1998; 317: 465-468.
  • 2. Chan BTB. The declining comprehensiveness of primary care. CMAJ 2002; 166: 429-434.
  • 3. Martin S. More hours, more tired, more to do: results from the CMA’s 2002 Physician Resource Questionnaire. CMAJ 2002; 167: 521-522.
  • 4. Slade S, Busing N. Weekly work hours and clinical activities of Canadian family physicians: results of the 1997/98 National Family Physician Survey of the College of Family Physicians of Canada. CMAJ 2002; 166: 1407-1411.
  • 5. Davis D, O’Brien MA, Freemantle N, et al. Impact of formal continuing medical education: do conferences, workshops, rounds, and other traditional continuing education activities change physician behavior or health care outcomes? JAMA 1999; 282: 867-874.
  • 6. Davis DA, Thomson MA, Oxman AD, Haynes RB. Changing physician performance. A systematic review of the effect of continuing medical education strategies. JAMA 1995; 274: 700-705.
  • 7. Oxman AD, Thomson MA, Davis DA, Haynes RB. No magic bullets: a systematic review of 102 trials of interventions to improve professional practice. CMAJ 1995; 153: 1423-1431.
  • 8. The AGREE Collaboration. Appraisal of Guidelines Research and Evaluation. Available at: www.agreecollaboration.org (accessed Feb 2004).
  • 9. Ontario Medical Association and Ontario Ministry of Health and Long-Term Care. GAC: Guidelines Advisory Committee. Recommended clinical practice guidelines. Available at: www.gacguidelines.ca (accessed Feb 2004).
  • 10. Mazmanian PE, Davis DA. Continuing medical education and the physician as a learner: guide to the evidence. JAMA 2002; 288: 1057-1060.
  • 11. Munro J, Booth A, Nicholl J. Routine preoperative testing: a systematic review of the evidence. Health Technol Assess 1997; 1: i-iv, 1-62.
  • 12. Eagle KA, Chair. ACC/AHA Guideline update on perioperative cardiovascular evaluation for noncardiac surgery. A report of the American College of Cardiology/American Heart Association Task Force on Practice Guidelines (Committee to Update the 1996 Guidelines on Perioperative Cardiovascular Evaluation for Noncardiac Surgery). 2002. Available at: www.americanheart.org/downloadable/heart/1013454973885perio_update.pdf (accessed Feb 2004).
  • 13. Gander L. Selective chest radiography: guidelines review. Prepared for the Health Services Utilization and Research Commission, Saskatchewan. 2000. Available at: www.hsurc.sk.ca/resource_centre/clinical/Chest%20radiography%20guideline%20review.pdf (accessed Feb 2004).
  • 14. Kiefe CI, Allison JJ, Williams OD, et al. Improving quality improvement using achievable benchmarks for physician feedback: a randomized controlled trial. JAMA 2001; 285: 2871-2879.
  • 15. Jamtvedt G, Young JM, Kristoffersen DT, et al. Audit and feedback: effects on professional practice and health care outcomes. Cochrane Database Syst Rev 2003; (3): CD000259.
  • 16. Thomson O’Brien MA, Oxman AD, et al. Audit and feedback versus alternative strategies: effects on professional practice and health care outcomes. Cochrane Database Syst Rev 2000; (2): CD000260.
  • 17. Stross JK. The educationally influential physician. J Contin Educ Health Prof 1996; 16: 167-172.
  • 18. Stone TT, Kivlahan CH, Cox KR. Evaluation of physician preferences for guideline implementation. Am J Med Qual 1999; 14: 170-177.
  • 19. Atlas SJ, Deyo RA. Evaluating and managing acute low back pain in the primary care setting. J Gen Intern Med 2001; 16: 120-131.
  • 20. Patel AT, Ogle AA. Diagnosis and management of acute low back pain. Am Fam Physician 2000; 61: 1779-1786, 1789-1790.
  • 21. Institute for Clinical Systems Improvement. Low back pain, adult. 2003. Available at: www.icsi.org/knowledge/detail.asp?catID=29&itemID=149 (accessed Feb 2004).
  • 22. Grimshaw J, Campbell M, Eccles M, Steen N. Experimental and quasi-experimental designs for evaluating guideline implementation strategies. Fam Pract 2000; 17 Suppl 1: S11-S16.
