The concept is attractive — but capacity may limit its practicality
For many years, Australia has relied on supplementing its medical workforce with doctors who have qualified outside Australia. Each year, about 2500 of these medical practitioners, known as international medical graduates (IMGs), seek general registration with the Medical Board of Australia. For many IMGs, this has included sitting the clinical examinations conducted by the Australian Medical Council (AMC) as part of the Standard Pathway for IMGs. The eligibility standard for registration is set at the expected level of an Australian medical graduate at the time they complete their internship.1 Concerns have been expressed about the accessibility of these examinations and the ability of IMG candidates to pass them. Some of these problems were highlighted during a 2011–2012 inquiry by the House of Representatives Standing Committee on Health and Ageing, Lost in the labyrinth.2
Workplace-based assessments (WBAs) have been developed as alternatives to the Objective Structured Clinical Examination (OSCE) and other approaches for assessing clinical competence.3 They have the perceived advantage of allowing a single set of high stakes (summative) assessments in an examination environment to be replaced by multiple low stakes (formative) assessments conducted by supervising clinicians over a period of time and in the workplace. These methods have been progressively adopted in recent years by medical specialist colleges. The AMC commissioned a trial of WBAs as an alternative to its clinical examination in 2010, and has subsequently incorporated WBAs into the Standard Pathway.
An article by Nair and colleagues in this issue of the MJA evaluates the reliability of WBAs in this context.4 The usefulness of an assessment method relies on a number of psychometric criteria being fulfilled. These include the concepts of validity (does the assessment method reflect performance in practice?), reliability (is it reproducible and consistent?), feasibility (is it an efficient use of resources?), acceptability, and educational impact.5 WBAs are generally recognised as having high validity, as they are conducted in the workplace where the doctor is practising and are modelled on and executed as part of normal clinical practice.
Reports have previously been published on the feasibility and acceptability of WBAs for assessing IMGs, and on the reliability of a single method, the mini-clinical evaluation exercise (mini-CEX).6-8 A combination of assessment methods, however, allows different aspects of clinical practice to be assessed.3 The article by Nair and colleagues examined the reliability of such a combination (“composite reliability”), and found it to be good (reliability coefficient greater than 0.8).
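To illustrate what a reliability coefficient of this kind captures, the sketch below computes Cronbach's alpha, one common internal-consistency coefficient, for a small set of invented candidate scores across several assessment components. This is only an illustration of the concept; it is not the composite reliability method used by Nair and colleagues, and all scores shown are hypothetical.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (candidates x assessment components) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                         # number of assessment components
    item_vars = scores.var(axis=0, ddof=1)      # sample variance of each component
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of candidates' composite totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical scores for 6 candidates across 4 WBA components (e.g. mini-CEX,
# case-based discussion, multisource feedback, in-training assessment)
scores = [
    [7, 8, 7, 8],
    [5, 5, 6, 5],
    [9, 9, 8, 9],
    [6, 7, 6, 6],
    [4, 5, 4, 4],
    [8, 8, 9, 8],
]
print(round(cronbach_alpha(scores), 2))  # prints 0.98 for these invented data
```

A coefficient above 0.8, as reported by Nair and colleagues, indicates that the combined assessments rank candidates consistently rather than being dominated by component-to-component noise.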
Why is this important? In an examination process, reliability can be ensured by standardised processes, similar formats, and control over the examiners, candidates and the examination environment. However, this often compromises the validity of the examination. The increased validity associated with WBAs, on the other hand, can affect reliability because of variations and pressures in the clinical environment, the fact that examinees are dealing with real patients, differences in assessment formats, and the reduced control over candidates and assessors. A high degree of reliability for a combination of assessments indicates that candidates are being assessed in a standard and reproducible manner and to an equivalent standard of competence, comparable with that of the standard AMC examination process. The results reported by Nair and colleagues are, however, for a single program; further research may be needed before these findings can be generalised to other WBA programs for IMGs.
What do these results mean for assessing IMGs in the future? In 2015, 84 candidates participated in the WBA program, and 76 completed it successfully. In comparison, changes to the AMC processes and the establishment of the Vernon C. Marshall National Test Centre in Melbourne have allowed 2000 candidates to attempt the clinical examination over the same period (with about 590 completing it successfully).9 While WBAs have been shown to be feasible, affordable and reliable, they require resourcing and a commitment from the host institution. Further, candidates need to be recruited to a training position in a hospital offering the program. Investing in the program has rewarded some hospitals with improved recruitment and retention of practitioners, which is important in regional areas where this can be difficult. However, competition for training positions has increased; there are more than 3000 domestic medical graduates each year who need pre-vocational training.10
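The contrast in throughput and completion can be made concrete with the 2015 figures above (the examination completion figure is approximate):

```python
# Completion rates implied by the 2015 figures quoted above
wba_rate = 76 / 84        # WBA program: 76 of 84 candidates completed successfully
exam_rate = 590 / 2000    # AMC clinical examination: about 590 of 2000 candidates
print(f"WBA program: {wba_rate:.1%}")       # roughly 90%
print(f"AMC examination: {exam_rate:.1%}")  # roughly 30%
```

The WBA program thus passes a far higher proportion of a far smaller cohort, which is consistent with the capacity argument made here.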
The AMC clinical examination offers access to a greater number of IMGs (possibly reflected in the lower rate of successful completions than for WBA). Australia continues to be an attractive destination for medical graduates from other countries, and the demand for assessment will therefore continue. WBAs will be a useful part of that assessment, but there are limits to the number of candidates who can be accommodated by this approach, especially when compared with the AMC examination.
Provenance: Commissioned; externally peer reviewed.
- 1. Medical Board of Australia. Granting general registration to medical practitioners in the standard pathway who hold an AMC certificate. March 2011. http://www.medicalboard.gov.au/documents/default.aspx?record=WD11%2f4691&dbid=AP&chksum=kgA7KRs4HJI1ugAz%2bjIcFg%3d%3d (accessed July 2016).
- 2. House of Representatives Standing Committee on Health and Ageing. Lost in the labyrinth: report on the inquiry into registration processes and support for overseas trained doctors. Canberra: Parliament of Australia, 2012. http://www.aph.gov.au/parliamentary_business/committees/house_of_representatives_committees?url=haa/overseasdoctors/report.htm (accessed July 2016).
- 3. Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE Guide No. 31. Med Teach 2007; 29: 855-871.
- 4. Nair BR, Moonen-van Loon JMW, Parvathy MS, van der Vleuten CPM. Composite reliability of workplace-based assessment for international medical graduates. Med J Aust 2016; 205: 212-216.
- 5. van der Vleuten C. The assessment of professional competence: developments, research and practical implications. Adv Health Sci Educ 1996; 1: 41-67.
- 6. Nair BR, Hensley MJ, Parvathy MS, et al. A systematic approach to workplace-based assessment for international medical graduates. Med J Aust 2012; 196: 399-402.
- 7. Nair BR, Searles AM, Ling RI, et al. Workplace-based assessment for international medical graduates: at what cost? Med J Aust 2014; 200: 41-44.
- 8. Nair BR, Alexander HG, McGrath BP, et al. The mini clinical evaluation exercise (mini-CEX) for assessing clinical performance of international medical graduates. Med J Aust 2008; 189: 159-161.
- 9. Australian Medical Council Limited. Annual report 2015. Canberra: AMC, 2016. http://www.amc.org.au/files/c41a674ba203450aa4dc4e2735b4a47ef28d7713_original.pdf (accessed July 2016).
- 10. Medical Training Review Panel. Eighteenth report. Canberra: Commonwealth of Australia, 2015. http://www.health.gov.au/internet/main/publishing.nsf/Content/work-pubs-mtrp-18 (accessed July 2016).