
Revalidation is not to be feared and can be achieved by continuous objective assessment

Stephen N Bolsin, Elizabeth Cawson and Mark E Colson
Med J Aust 2015; 203 (3): 142-144. doi: 10.5694/mja14.00081
Published online: 3 August 2015

Summary

  • Revalidation is defined by the International Association of Medical Regulatory Authorities as “the process by which doctors have to regularly show that they are up to date, and fit to practice medicine”.
  • In December 2012, the General Medical Council in the United Kingdom introduced revalidation processes that involve medical practitioners collecting a portfolio of evidence for assessment and appraisal by a “responsible officer”.
  • The responsible officer is usually the medical director of the hospital or group of primary care providers and reports directly to the General Medical Council on the fitness of the doctor to practise in their current role. The time taken to collect and analyse the portfolio, the sources of evidence available and the cost of the revalidation process are all contentious issues.
  • We propose that effective revalidation processes based on performance measurement would be cost-effective and, if correctly applied, could lead to significant cost savings in Australian health care.
  • The driving force for an effective and efficient revalidation process should be the professional and ethical responsibility that each doctor has to their patients and to the society that has granted them the right to practise.

Medical professionalism in the United Kingdom has been a contentious issue for over a decade.1,2 Lack of competence may contribute to patient harm and is, therefore, part of the debate involved in defining and assessing professional competence.3,4

The International Association of Medical Regulatory Authorities defines revalidation as “the process by which doctors have to regularly show that they are up to date, and fit to practice medicine”.5 On 3 December 2012, the UK General Medical Council (GMC) introduced revalidation procedures that require doctors to collect a portfolio of evidence of professional competence and fitness to practise in their current role. This evidence is appraised by a “responsible officer”, usually the medical director of the hospital or group of primary care providers, who reports directly to the GMC, although further evidence may be required.6 The time taken to collect and analyse the portfolio, the sources of evidence available and the cost of the revalidation process are all contentious issues. To understand this bureaucratic drive to demonstrable competence, and to identify mechanisms to achieve this laudable goal, it is necessary to understand the context in which these proposals were developed.

An unforeseen consequence of health care’s increasing complexity is the identification of systemic health care error as a cause of patient harm and unnecessary cost.7 Safety experts calculate that the cost of error in Australia is “over $1 billion — possibly $2 billion — annually”, with 50% of errors potentially preventable.8 For context, total Australian health care expenditure was $130.3 billion in the 2010–11 financial year.9

Emerging evidence indicates that the rate of systemic health care error has not declined significantly since it was first identified, although some limited trials have shown promise in improving outcomes affected by failures of coordinated health care delivery.10-12 The medical profession should accept some responsibility for systemic health care error, particularly those errors that harm patients. We believe that the profession should be committed to rectifying current deficiencies and minimising future errors for ethical reasons, as well as the obvious reason of financial rectitude. The importance of an ethical component to the approach to patient safety is that it imposes an overarching imperative to guide professional behaviour.

Professional ethics are the rules or guidelines that dictate professional behaviour. They are related to, but distinct from, the morality of the society in which the profession practises. In 1999, the Tavistock Group of medical ethicists proposed that minimising errors, minimising unnecessary and inappropriate variation in practice, and a continuing responsibility to help improve quality were important professional ethical principles.13 These principles were endorsed by Peter Singer (professor of bioethics) in his review of medical ethics in 2000, and they remain applicable today.14

In this article, we examine how an ethical and professional commitment to reducing errors, adhering to best practice and improving quality of care should be reflected in the training and practice of competent doctors. One obvious mechanism for embedding such competent, professional practice is the process of revalidation once doctors have completed training.4,13,15 There is little doubt that patients would request and respect a revalidation process that provided them with less harmful, less costly, more available and higher-quality health care. This expectation is recognised by experienced medical educators and is implicit in the definition of professional competence published in 2002 by Ronald Epstein (professor of family medicine, psychiatry, oncology and nursing) and Edward Hundert (dean for medical education and professor of global health, social medicine and medical education): “the habitual and judicious use of communication, knowledge, technical skills, clinical reasoning, emotions, values, and reflection in daily practice for the benefit of the individual and community being served”.16 This thoughtful and inclusive definition from the United States is broader than current definitions from Europe. The essential facets of competence (FOC) proposed by Dutch and German medical educators in 2013 identify six more limited professional competencies,17 shown in Box 1. These FOC overlap with the GMC’s four domains of expected standards of practice,18 shown in Box 2.

The potential for broad, societal input into the processes defining medical competence is highlighted by the different emphases of the various definitions: the science, empiricism and professional development of the Dutch and German medical educators’ FOC; the reflection on daily practice and consideration of community benefit in the US definition; and the inclusion of the patient and the public in the GMC’s domains. Including the views of patients and the lay public in the revalidation process enables professionals to acknowledge community considerations of competence and revalidation in the society in which they are practising. It is possible to incorporate such broad definitions of professional competence into valid assessments of individual and organisational professional practice that will contribute to improved outcomes, systematic safety, reduced health costs and higher-quality care. The principles of these assessments were developed in the 1980s and are supported by good evidence. They fall into three broad groups, linked by the need for measurement or assessment of performance in all specialties.

First, collecting, analysing and providing feedback on outcome data at a unit level and an individual level will improve patient outcomes.19-21 Although New York State’s performance monitoring exercise, which commenced in the late 1980s, was originally labelled “report cards” and viewed with suspicion, the feedback of risk-adjusted performance data in surgical specialties is now an accepted means of ensuring the delivery of high-quality clinical services in many countries, including Australia.20,22,23 Such data collections must be physician led and conducted in a non-threatening manner to ensure successful adoption.19,20,22,23
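The arithmetic behind such risk-adjusted feedback can be sketched briefly. The following is an illustration only, assuming each patient has a predicted risk of death from a validated logistic risk model; the figures and the simple observed-to-expected ratio here are hypothetical and not taken from the schemes cited above:

```python
# Illustrative sketch of unit-level risk-adjusted outcome feedback.
# Real report-card schemes use validated, specialty-specific risk models;
# the data below are invented for demonstration.

def observed_expected_ratio(cases):
    """cases: list of (died, predicted_risk) pairs, one per patient."""
    observed = sum(1 for died, _ in cases if died)          # actual deaths
    expected = sum(risk for _, risk in cases)               # model-predicted deaths
    return observed / expected if expected else float("nan")

# One unit's caseload: outcome plus model-predicted risk for each patient.
unit_cases = [(False, 0.02), (False, 0.05), (True, 0.10), (False, 0.03), (True, 0.20)]

ratio = observed_expected_ratio(unit_cases)
# A ratio above 1 suggests worse-than-expected outcomes; below 1, better.
print(f"O/E ratio: {ratio:.2f}")
```

Feeding such a ratio back to the unit, rather than raw mortality, is what allows fair comparison between units treating patients of different risk.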

Second, specialist registries, which serve a similar function, have proved valuable as voluntary data collections that can be used to review practice, act as centre report cards and generate hypotheses for randomised trials.24 Medical practitioners who self-report their performance are already reflecting on their practice and are therefore more closely aligned with modern principles of medical education and revalidation.4,16 Such voluntary examinations of practice take the profession beyond the New York State performance monitoring exercise for providers of cardiac services, which started in 1988 as a compulsory data collection; contributing data to the state database was a licensing requirement for providers of cardiac services in New York State. Moreover, these registries represent evidence of “the habitual and judicious use of communication, knowledge, technical skills, clinical reasoning, emotions, values, and reflection in daily practice for the benefit of the individual and community being served”.16 This can be seen as competence at the coalface, which should contribute to revalidation in a very practical way.16,25-28 The routine collection of personal and unit performance data would render unnecessary retrospective examinations of performance at the time of perceived, or actual, problems with patient safety, or questions pertaining to a practitioner’s competence.29,30 Although the Bristol heart scandal and the Harold Shipman murders relate to the National Health Service in the UK, the effect on the Australian public of the Jayant Patel case in Bundaberg should not be underestimated, and any mechanism for preventing recurrence should be carefully examined.2,29

Third, applying a combination of technology and statistical analysis to monitoring individual performance facilitates high-quality, objective performance monitoring and is easily achieved for practical procedures.25 These advances represent an enormous opportunity for the objective measurement of competence and quality in health care.4 This level of professionalism has been advocated by some European medical educators, through “entrustable professional activities”, as a means of addressing medical professional competence.31 Objective analysis of professional performance has proved valuable when coupled with statistical methods to define and confirm competence. These methods were pioneered in paediatric cardiac surgery and subsequently applied in other specialties.4,21,25,32 We are convinced that the routine collection of these data during training and subsequent specialist practice makes revalidation easier, and perhaps even irrelevant.4 We suggest that revalidation may become irrelevant not to be controversial but to emphasise the value of routine performance monitoring: the collection of appropriate data in routine practice supports the revalidation process more accurately and objectively than the processes currently suggested.4 In fact, only those practitioners not collecting performance data may need to undergo a formal revalidation process, because objective evidence of good practice would not be available for them.4,6
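The statistical methods referred to here include cumulative sum (CUSUM) charts of procedural outcomes. A minimal sketch of a binary-outcome CUSUM follows; the acceptable and unacceptable failure rates (p0, p1) and the signalling threshold (h) are placeholder values for illustration, not parameters from the cited studies, which would be set from specialty-specific evidence:

```python
import math

# Illustrative CUSUM chart for a practitioner's procedural failure rate.
# Parameters are placeholders; real monitoring schemes calibrate p0, p1
# and h to the procedure and the desired false-alarm rate.

def cusum(outcomes, p0=0.02, p1=0.05, h=3.0):
    """outcomes: sequence of 0 (success) / 1 (failure).
    Returns the index at which the chart signals, or None."""
    # Log-likelihood-ratio weights: failures push the statistic up,
    # successes pull it down, and it is floored at zero.
    w_fail = math.log(p1 / p0)
    w_succ = math.log((1 - p1) / (1 - p0))
    s = 0.0
    for i, failed in enumerate(outcomes):
        s = max(0.0, s + (w_fail if failed else w_succ))
        if s >= h:
            return i  # evidence that performance has drifted towards p1
    return None  # no evidence of unacceptable performance

# A run of procedures with a cluster of failures near the end:
record = [0] * 40 + [1, 0, 1, 1, 0, 1, 1]
print(cusum(record))
```

The attraction for revalidation is that the chart accumulates evidence continuously during routine practice, rather than requiring a separate retrospective audit.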

Revalidation based on the UK model may not fulfil all the requirements of the medical profession in Australia. However, the broader definitions of competence arising from the US and the UK (including reflection on practice, consideration of community benefit, and protecting the health of patients and the public) indicate that performance monitoring is a much more effective tool for identifying and supporting contributors to poor health care performance. The evidence that the collection of performance data and feedback of results improve unit and individual clinician performance, thereby reducing patient mortality and morbidity, represents a tangible commitment to reflective practice and to patient and public health.19,20,33 Such professional activity provides a clear example of determination to achieve demonstrable competence. Participation in such data collections may be more valuable to patients, the medical profession and the bottom line than the more formulaic aspects of existing or proposed revalidation processes.

With respect to unnecessary and inappropriate variation in practice and its impact on patient safety, there is now very good evidence that treatment which follows the guidelines of professional bodies achieves better outcomes than treatment that omits important components of care.34 The high-quality data for this finding came from CRUSADE, a large United States quality improvement registry of the management of acute coronary syndromes (ACS), including acute myocardial infarction (AMI). It confirmed that a 10% increase in compliance with American College of Cardiology guideline-recommended processes of care was associated with a 10% decrease in mortality for patients with ACS and AMI. Collecting this process adherence data alongside personal performance data would improve the quality of revalidation assessments at multiple levels in health care; for example, at organisational (hospital), unit (team) and individual (doctor) levels. Coupling the collection of unit and individual performance data with supervised feedback of results should optimise assessments of individuals in their workplace. This will advance the goal of objective, measured competence, confirming “the habitual and judicious use of communication, knowledge, technical skills, clinical reasoning, emotions, values, and reflection in daily practice for the benefit of the individual and community being served”.16
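A composite process-adherence measure of this general kind asks, for each patient, which guideline-recommended care opportunities they were eligible for and which were actually delivered. The sketch below illustrates that calculation; the process names and data are invented, and the simple pooled fraction shown is only one of several ways such scores are constructed:

```python
# Hypothetical sketch of a composite guideline-adherence score: the
# fraction of eligible care opportunities that were actually delivered.
# Process names and patient data are invented for illustration.

def adherence_score(patients):
    """patients: list of dicts mapping process name -> (eligible, given)."""
    eligible = sum(1 for p in patients for e, _ in p.values() if e)
    delivered = sum(1 for p in patients for e, g in p.values() if e and g)
    return delivered / eligible if eligible else float("nan")

cohort = [
    # Patient 1: eligible for all three processes, statin not given.
    {"aspirin": (True, True), "beta_blocker": (True, True), "statin": (True, False)},
    # Patient 2: contraindication makes them ineligible for a beta-blocker.
    {"aspirin": (True, True), "beta_blocker": (False, False), "statin": (True, True)},
]

print(f"Composite adherence: {adherence_score(cohort):.0%}")
```

Conditioning on eligibility matters: a unit is not penalised for withholding care from patients with contraindications, so the score reflects avoidable omissions only.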

Through these processes, we believe that the medical profession can lead other health care professions to achieve a level of training and maintained competence that exceeds the current formulae for revalidation. Such personal professional monitoring in medicine, through self-reporting, can objectively and routinely confirm the value of the profession’s health interventions in maintaining the health of individual patients and the public. If the profession takes that lead, it will be well placed to ensure that the health interventions of the future continue to be of higher quality, deliver optimal benefit and become more affordable to the populations in which doctors practise and the communities that they serve.

1  Essential facets of competence*

  • Scientifically and empirically grounded method of working
  • Knowing and maintaining own personal bounds and possibilities
  • Active professional development
  • Teamwork and collegiality
  • Active listening to patients
  • Verbal communication with colleagues and supervisors

2  General Medical Council’s four domains of expected standards of practice*

1. Knowledge, skills and performance

  • Make the care of your patient your first concern
  • Provide a good standard of practice and care
    • Keep your professional knowledge and skills up to date
    • Recognise and work within the limits of your competence

2. Safety and quality

  • Take prompt action if you think that patient safety, dignity or comfort is being compromised
  • Protect and promote the health of patients and the public

3. Communication, partnership and teamwork

  • Treat patients as individuals and respect their dignity
    • Treat patients politely and considerately
    • Respect patients’ right to confidentiality
  • Work in partnership with patients
    • Listen to, and respond to, their concerns and preferences
    • Give patients the information they want or need in a way they can understand
    • Respect patients’ right to reach decisions with you about their treatment and care
    • Support patients in caring for themselves to improve and maintain their health
  • Work with colleagues in the ways that best serve patients’ interests

4. Maintaining trust

  • Be honest and open and act with integrity
  • Never discriminate unfairly against patients or colleagues
  • Never abuse your patients’ trust in you or the public’s trust in the profession

Provenance: Not commissioned; externally peer reviewed.

  • Stephen N Bolsin
  • Elizabeth Cawson
  • Mark E Colson

  • Geelong Hospital, Geelong, VIC


Competing interests:

No relevant disclosures.

  • 1. Smith R. All changed, changed utterly. British medicine will be transformed by the Bristol case. BMJ 1998; 316: 1917-1918.
  • 2. Irvine DH. New ideas about medical professionalism. Med J Aust 2006; 184: 204-205.
  • 3. Ellis FR. Measurement of competence. Br J Anaesth 1995; 75: 673-674.
  • 4. Bolsin SN, Colson ME. Measuring competence makes revalidation easier. Neurourol Urodyn 2013; 32: 968.
  • 5. International Association of Medical Regulatory Authorities. Revalidation [definition]. http://www.iamra.com/glossary#revalidation (accessed Jul 2015).
  • 6. Walshe K, Archer J. Medical regulation: more reforms are needed. BMJ 2014; 349: g5744.
  • 7. Wilson RM, Runciman WB, Gibberd RW, et al. The Quality in Australian Health Care Study. Med J Aust 1995; 163: 458-471.
  • 8. Armstrong BK, Gillespie JA, Leeder SR, et al. Challenges in health and health care for Australia. Med J Aust 2007; 187: 485-489.
  • 9. Australian Institute of Health and Welfare. Health expenditure Australia 2010–11. Canberra: AIHW, 2012. (AIHW Cat. No. HWE 56; Health and Welfare Expenditure Series No. 47.) http://www.aihw.gov.au/WorkArea/DownloadAsset.aspx?id=10737423003
  • 10. Wilson RM, Van Der Weyden MB. The safety of Australian healthcare: 10 years after QAHCS. Med J Aust 2005; 182: 260-261.
  • 11. Wang Y, Eldridge N, Metersky ML, et al. National trends in patient safety for four common conditions, 2005-2011. N Engl J Med 2014; 370: 341-351.
  • 12. McCannon J, Berwick D. A new frontier in patient safety. JAMA 2011; 305: 2221-2222.
  • 13. Smith R, Hiatt H, Berwick D. Shared ethical principles for everybody in health care: a working draft from the Tavistock Group. BMJ 1999; 318: 248-251.
  • 14. Singer PA. Recent advances. Medical ethics. BMJ 2000; 321: 282-285.
  • 15. Oakley J, Clarke S. Introduction: accountability, informed consent and clinical performance information. In: Clarke S, Oakley J, editors. Informed consent and clinician accountability: the ethics of report cards on surgeon performance. Cambridge: Cambridge University Press, 2007: 1-21.
  • 16. Epstein RM, Hundert EM. Defining and assessing professional competence. JAMA 2002; 287: 226-235.
  • 17. Wijnen-Meijer M, van der Schaaf M, Nillesen K, et al. Essential facets of competence that enable trust in medical graduates: a ranking study among physician educators in two countries. Perspect Med Educ 2013; 2: 290-297.
  • 18. General Medical Council. Good medical practice: the duties of a doctor registered with the General Medical Council. http://www.gmc-uk.org/guidance/good_medical_practice/duties_of_a_doctor.asp (accessed Feb 2015).
  • 19. Hannan EL, Kilburn H, Racz M, et al. Improving the outcomes of coronary artery bypass surgery in New York State. JAMA 1994; 271: 761-766.
  • 20. O’Connor GT, Plume SK, Olmstead EM, et al. A regional intervention to improve the hospital mortality associated with coronary artery bypass graft surgery. The Northern New England Cardiovascular Disease Study Group. JAMA 1996; 275: 841-846.
  • 21. de Leval MR, Francois K, Bull C, et al. Analysis of a cluster of surgical failures. Application to a series of neonatal arterial switch operations. J Thorac Cardiovasc Surg 1994; 107: 914-923.
  • 22. Green J, Wintfeld N. Report cards on surgeons. Assessing New York State’s approach. N Engl J Med 1995; 332: 1229-1232.
  • 23. Reid CM, Rockell M, Skillington PD, et al. Initial twelve months experience and analysis for 2001-2002 from the Australasian Society of Cardiac and Thoracic Surgeons–Victorian Database Project. Heart Lung Circ 2004; 13: 291-297.
  • 24. McDonald SP, Russ GR. Australian registries - ANZDATA and ANZOD. Transplant Rev (Orlando) 2013; 27: 46-49.
  • 25. Bent PD, Bolsin SN, Creati BJ, et al. Professional monitoring and critical incident reporting using personal digital assistants. Med J Aust 2002; 177: 496-499.
  • 26. Bolsin S, Colson M. Making the case for personal professional monitoring in health care. Int J Qual Health Care 2003; 15: 1-2.
  • 27. Hannan EL, Kumar D, Racz M, et al. New York State’s Cardiac Surgery Reporting System: four years later. Ann Thorac Surg 1994; 58: 1852-1857.
  • 28. Colson M, Bolsin S. The use of statistical process control methods in monitoring clinical performance. Int J Qual Health Care 2003; 15: 445.
  • 29. Spiegelhalter D, Grigg O, Kinsman R, Treasure T. Risk-adjusted sequential probability ratio tests: applications to Bristol, Shipman and adult cardiac surgery. Int J Qual Health Care 2003; 15: 7-13.
  • 30. Bolsin S, Barach P. The role and influence of public reporting of pediatric cardiac care outcome data. Prog Pediatr Cardiol 2012; 33: 99-101.
  • 31. ten Cate O. Entrustability of professional activities and competency-based training. Med Educ 2005; 39: 1176-1177.
  • 32. Lim TO, Soraya A, Ding LM, Morad Z. Assessing doctor’s competence: application of CUSUM technique in monitoring doctors’ performance. Int J Qual Health Care 2002; 14: 251-258.
  • 33. Aylin P, Bottle A, Jarman B, Elliott P. Paediatric cardiac surgical mortality in England after Bristol: descriptive analysis of hospital episode statistics 1991-2002. BMJ 2004; 329: 825-827.
  • 34. Peterson ED, Roe MT, Mulgund J, et al. Association between hospital process performance and outcomes among patients with acute coronary syndromes. JAMA 2006; 295: 1912-1920.


04:42, 26 August 2015
Abdul Qadir Imran

I read with interest the recent article by Bolsin et al. entitled ‘Revalidation is not to be feared and can be achieved by continuous objective assessment’.(1) This is a well-written piece articulating three principles of assessment: the collection and analysis of unit and individual performance data with supervised feedback, the use of health service and specialist registries, and the use of statistical analysis to monitor performance data.(1)

I agree with the authors that ‘revalidation based on the UK model may not fulfil all the requirements….in Australia’.(1) In the UK, revalidation was introduced partly with the aim of restoring public trust in the medical profession following several regulatory failures, e.g. the Bristol Royal Infirmary and Harold Shipman cases.(2) Revalidation involves every doctor in independent practice participating in an annual appraisal and undergoing a more detailed assessment every 5 years.(2)

The Medical Board of Australia (MBA) first broached the idea of revalidation in 2012.(2) Earlier this year, the MBA commissioned international research into revalidation in Australia. However, like Breen, I believe that ‘there has been no widespread loss of faith of the community either in its doctors or in the regulatory system’.(2)

According to a landmark study by Bismark et al., 3% of Australia’s medical workforce accounted for 49% of complaints, and 1% accounted for a quarter of complaints.(3) Bismark et al. also found it feasible to predict which doctors are at high risk of incurring more complaints in the near future.(3) Instead of introducing an expensive revalidation system enforcing a ‘blanket rule’ for all doctors, I believe it would be more cost-effective to undertake targeted revalidation of high-risk doctors and to re-evaluate and enhance our current registration and accreditation system.

References

1. Bolsin S, Cawson E, Colson M. Revalidation is not to be feared and can be achieved by continuous objective assessment. Med J Aust [Internet]. 2015 Aug [cited 2015 Aug 25];203 (3): 142-144. Available from: https://www.mja.com.au/journal/2015/203/3/revalidation-not-be-feared-and-can-be-achieved-continuous-objective-assessment DOI: 10.5694/mja14.00081
2. Breen K. Revalidation – what is the problem and what are the possible solutions? Med J Aust [Internet]. 2014 [cited 2015 Aug 25]; 200 (3): 153-156. Available from: https://www.mja.com.au/journal/2014/200/3/revalidation-what-problem-and-what-are-possible-solutions DOI: 10.5694/mja13.11261
3. Bismark M, Spittal M, Gurrin L, Ward M, Studdert D. Identification of doctors at risk of recurrent complaints: a national study of healthcare complaints in Australia. BMJ Qual Saf [Internet]. Published Online First: 2013 Apr 10 [cited 2015 Aug 25]. Available from: http://qualitysafety.bmj.com/content/early/2013/02/22/bmjqs-2012-001691.full DOI:10.1136/bmjqs-2012-001691


Competing Interests:

Dr Abdul Qadir Imran
Western Health, Victoria

02:29, 3 September 2015
David Langton

In August 2015, the MJA published a thought-provoking article by Stephen Bolsin et al entitled “Revalidation is not to be feared and can be achieved by continuous objective assessment”. I wanted to share with the readership a very personal example of the positive benefits of revalidation assessment.

Several years ago I had a series of highly traumatic work-related encounters. After several months, I developed an anxiety disorder and was afraid even to answer the telephone. I seriously doubted my competence in caring for patients and whether I would be able to return to work.

To rebuild my confidence, I attended the European Summer School in Respiratory Medicine, a five-day intensive study programme. Later the same year, I sat the European Diploma of Respiratory Medicine examination (1), which sets the benchmark for respiratory practice in Europe. It is a clinically orientated but testing three-hour written examination.

The boost to my psyche from the knowledge that I was proven to be up to date in my discipline by an independent examination set by my peers was invaluable, and contributed substantially to my recovery. It certainly demonstrated to me that revalidation is not to be feared – but quite the reverse, it was a rewarding experience.

References
1. http://www.hermes.ersnet.org

Competing Interests:

Assoc Prof David Langton
Peninsula Health, Victoria
