Objective: To measure specialist international medical graduates' (SIMGs) level of learning through participation in guided tutorials, face-to-face or through videoconferencing (VC), and the effect of tutorial attendance and quality of participation on success in specialist college examinations.
Design and setting: Tutorials were conducted at the Royal Brisbane and Women's Hospital between 19 September 2007 and 23 August 2010, and delivered through VC to participants at other locations. Tutorials were recorded and transcribed, and speaker contributions were tagged and ranked using content analysis software. Summary examination results were obtained from the Australian and New Zealand College of Anaesthetists.
Main outcome measures: Tutorial participation and attendance, and college examination pass and fail rates.
Results: Transcripts were obtained for 116 tutorials. The median participation percentage for those who subsequently failed the college examinations was 1% (interquartile range [IQR], 0%–1%), while for those who passed the exams it was 5% (IQR, 2%–8%; P < 0.001). There was also an association between attendance and exam success; the median (IQR) attendance of those who failed was 24% (IQR, 14%–39%), while for those who passed it was 59% (IQR, 39%–77%; P < 0.001).
Conclusions: Use of VC technology was found to be a feasible method to assist SIMGs to become aware of the requirements of the exam and to prepare more effectively.
Specialist international medical graduates (SIMGs) make a significant contribution to the provision of health care in Australia, particularly in designated “areas of need”. Working in areas of need is usually a visa requirement, which means that SIMGs, like other international medical graduates, are often professionally isolated from their peers.1 This isolation may inhibit their professional development.2 Location and other cultural factors are known to influence learning opportunities.3
Specialist medical colleges administer specialist training programs and examinations, including those for SIMGs. The examination and assessment procedures for SIMGs are the same as, or closely based on, those for local specialist trainees.
In the case of anaesthetists, the Australian and New Zealand College of Anaesthetists (ANZCA) performs the assessment of SIMGs' qualifications and experience, and compares these with the required standard for local specialist trainees at a similar level. The assessment is based on a paper application and an interview. Candidates' specialist experience is assessed in terms of casemix, use of equipment and drugs, and compliance with standards of good anaesthetic practice.4 The ANZCA decides whether the doctor is (i) substantially comparable, (ii) partially comparable or (iii) non-comparable to a newly qualified Fellow of the college. The college requirement for substantially and partially comparable SIMGs in anaesthesia is that they complete a period of satisfactory clinical practice and pass the performance assessment. Assessment for partially comparable applicants also includes a structured exam.
SIMGs in anaesthesia have traditionally performed poorly in the ANZCA specialist assessment process. To identify education-related reasons for this relative lack of success, a common metric is needed to quantify their preparatory efforts for the assessment.
In this research, we used content analysis techniques to measure candidates' level of learning through participation in group discussion, including long-distance candidate participation through videoconferencing (VC). A key purpose of this study was to obtain an accurate representation of peer interaction to assess its quality and content. We chose an automated approach for this investigation to circumvent known human errors associated with traditional content analysis, such as inconsistent interpretation and application of units of measurement.5 We used concept analysis software that works with uncoded document text, avoiding the concerns associated with reliance on a preconstructed data dictionary, such as attributing an incorrect meaning to text and interpreting it out of context. Our aim was not to assess the quality of the content discussed against criteria for academic achievement, but rather to measure the overall level of preparation for an academic assessment.
This study took place in Queensland, Australia, between 19 September 2007 and 23 August 2010. Remotely located SIMGs in anaesthesia were supported by adapting an existing Royal Brisbane and Women's Hospital (RBWH) exam preparation program for its registrars. Through this program, SIMGs preparing for specialist examinations were invited to take part in weekly 2-hour VC sessions, delivered in two tutorial blocks per year over the period of the study. A tutorial block ended about 6 weeks before college exams, which are held each year in May and September. Educational sessions were conducted from the RBWH and delivered through VC to participants at other locations. The program involved a facilitated tutorial, including exam question practice and guided discussion of exam topics. Tutorial questions were written by the tutor, who was a past examiner of the college. Both tutorial components involved verbal communication and interaction between candidates. The tutor helped learners to construct meaning from critical reflection on their assumptions, and promoted group discussion.6 The tutorials were recorded on DVD. The study was approved by the ethics committees of Queensland Health and the University of Queensland.
The recordings from each tutorial were transcribed, and computer-assisted content analysis was applied to the transcripts. The content of the tutorials was analysed using concept analysis software, Leximancer version 3.5 (Leximancer). This software is able to interpret all types of text, including short non-grammatical comments.7-10 Using this software ensured reproducibility of results. Each transcript was checked for accuracy by a second researcher. Speaker contributions were tagged by the software and ranked by the percentage of the overall number of concepts each speaker contributed. The resulting ranked list was used to interpret participation, which was defined as the percentage of these concepts contributed by an individual candidate. Attendance was measured as the percentage of potential sessions attended by the candidate.
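As a rough illustration, the two metrics described above can be sketched as follows. The speaker names and concept counts are hypothetical, and this does not reproduce Leximancer's actual concept extraction; it only shows the arithmetic applied to its tagged output:

```python
# Sketch of the participation and attendance metrics.
# Speaker tags and concept counts below are hypothetical illustrations.

def participation_percentages(concept_counts):
    """Map each speaker to the percentage of all tagged concepts they contributed."""
    total = sum(concept_counts.values())
    return {speaker: round(100 * n / total, 1)
            for speaker, n in concept_counts.items()}

def attendance_percentage(sessions_attended, sessions_offered):
    """Attendance as the percentage of potential sessions the candidate attended."""
    return round(100 * sessions_attended / sessions_offered, 1)

counts = {"Tutor": 120, "Candidate A": 30, "Candidate B": 6, "Candidate C": 44}
print(participation_percentages(counts))
print(attendance_percentage(13, 22))  # 13 of 22 possible sessions attended
```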
Successful ANZCA exam candidates are not informed of their mark by the college, and marks were not available for this research. To assess the impact on SIMG pass rates, summary examination results were collected from published data presented in the quarterly ANZCA bulletins. Those who did not sit the specialist exams at the end of a tutorial block were classified as not successful. The association between concept measures and exam success was investigated using the Mann–Whitney U test.
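A minimal sketch of the Mann–Whitney U test used to compare the participation of passing and failing candidates is shown below. The sample values are hypothetical; in practice a statistics package (e.g. scipy.stats.mannwhitneyu) would be used and would also report the P value:

```python
# Pure-Python sketch of the Mann-Whitney U statistic, with average
# ranks for tied values. Sample data below are hypothetical.

def mann_whitney_u(x, y):
    """Return the (two-sided) U statistic for independent samples x and y."""
    # Pool both samples, tagging each value with its group (0 = x, 1 = y).
    combined = sorted((v, 0 if i < len(x) else 1)
                      for i, v in enumerate(list(x) + list(y)))
    values = [v for v, _ in combined]
    # Assign 1-based ranks, sharing the average rank among tied values.
    ranks = {}
    i = 0
    while i < len(values):
        j = i
        while j < len(values) and values[j] == values[i]:
            j += 1
        avg_rank = (i + 1 + j) / 2  # average of ranks i+1 .. j
        for k in range(i, j):
            ranks[k] = avg_rank
        i = j
    # Rank sum for the first sample, then the U statistic.
    r1 = sum(ranks[k] for k, (_, grp) in enumerate(combined) if grp == 0)
    n1, n2 = len(x), len(y)
    u1 = r1 - n1 * (n1 + 1) / 2
    return min(u1, n1 * n2 - u1)

failed = [0, 1, 1, 1, 2]      # hypothetical participation % of failing candidates
passed = [2, 4, 5, 7, 8, 9]   # hypothetical participation % of passing candidates
print(mann_whitney_u(failed, passed))
```

A small U relative to n1 × n2 indicates little overlap between the two groups' participation distributions, consistent with the separation reported in the Results.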
Transcripts were obtained for 166 participants, from 116 recorded tutorials delivered at 31 sites. Of the total number of tutorials run, 19 were not transcribed because of technical difficulties. Fifty-four participants chose to attend more than once: 27 attended one extra tutorial block and 27 attended more than one extra block, giving 232 attendances overall.
The Box displays the number of tutorial viewing sites, their locations across Australia and the number of participants at each site. Attendees at the main RBWH site were mainly advanced registrars from metropolitan Brisbane, together with up to four SIMGs in anaesthesia. A small number of SIMGs from metropolitan Brisbane continued to travel to attend the tutorials face-to-face, stating that the tutorial experience was very different through VC. The technology did not, however, appear to hinder results.
Of the 166 tutorial program participants, 142 sat the exam. Overall, 103 participants passed: 87 on their first attempt, 14 after one failed attempt and two after two or more attempts. In 2006, the pass rate for SIMGs sitting the ANZCA final examinations was 40%.11 In September 2008, the exam period immediately after the introduction of the tutorial program, the pass rate for SIMGs who participated in the tutorials was 72%, whereas it remained at 41% for SIMGs who did not participate.12 The pass rate was 80% for local specialist trainees during the same period.
Tutorial participation was strongly associated with exam success. The median participation percentage for those who subsequently failed the college examinations was 1% (interquartile range [IQR], 0%–1%), while for those who passed the exams it was 5% (IQR, 2%–8%; P < 0.001). This was evident across six cohorts of SIMGs in anaesthesia. There was also an association between attendance and exam success; the median (IQR) attendance of those who failed was 24% (IQR, 14%–39%), while for those who passed it was 59% (IQR, 39%–77%; P < 0.001).
VC tutorials proved a feasible way to help SIMGs understand the requirements of the exam and prepare for it more effectively. We developed an objective measure of exam preparedness using a software tool that can be used to meaningfully interpret participation in any similar group discussion forum. The software's user interface is relatively simple, and offers useful insight into the level of participant learning. The tool has the potential to provide immediate formative assessment, should advances in voice recognition technology reach a point where rapid and accurate transcription of discussion can be provided.
The content analysis software uses contemporary information retrieval techniques that may be considered out of step with accepted traditional content analysis methods. The software may also have potential for predicting outcomes from other medical education methods that employ group discussion. Forum integrity can be maintained through the ability to measure the relevant content discussed (participation). A limitation of the VC approach is occasional audio lag, which can hamper the flow of discussion, shorten responses from distant sites and so reduce the analysable text from distant participants.
Participation was regarded as a more useful predictor of examination success than attendance because it may provide additional information which attendance alone cannot provide. That is, it can provide the educator with a means of simultaneously assessing multiple exam candidates' understanding of concepts as well as allowing candidates to monitor their own level of preparation. A particular additional benefit is that it is feasible to use contemporary content analysis approaches to successfully provide ongoing assessment of geographically distant candidates' performance. The use of content analysis in specialist training programs could provide better information about candidate suitability and deserves consideration as a method to boost examination success rates.
Box: Regional location of participants in college exam preparation tutorials, showing tutorial sites* (n = 31) and tutorial participants (n = 166) by region; site 1 was the face-to-face tutorial at the RBWH.
* The main tutorial site and all videoconference sites were located in hospitals. NSW = New South Wales. NT = Northern Territory. Qld = Queensland. RBWH = Royal Brisbane and Women's Hospital. SA = South Australia. Tas = Tasmania. Vic = Victoria. WA = Western Australia.
Received 7 November 2012, accepted 26 May 2013
- 1. Han G-S, Humphreys JS. Overseas-trained doctors in Australia: community integration and their intention to stay in a rural community. Aust J Rural Health 2005; 13: 236-241.
- 2. Higgins NS, Taraporewalla K, Steyn M, et al. Workforce education issues for international medical graduate specialists in anaesthesia. Aust Health Rev 2010; 34: 246-251.
- 3. Dywili S, Bonner A, Anderson J, O'Brien L. Experience of overseas-trained health professionals in rural and remote areas of destination countries: a literature review. Aust J Rural Health 2012; 20: 175-184.
- 4. Australian and New Zealand College of Anaesthetists. Reg 23: Recognition as a specialist in anaesthesia for international medical graduate specialists (IMGS) and admission to fellowship by assessment for IMGS. ANZCA, 2012. (accessed Mar 2012).
- 5. Strijbos J-W, Martens RL, Prins FJ, Jochems WMG. Content analysis: what are they talking about? Comput Educ 2006; 46: 29-48.
- 6. Brookfield SD, Preskill S. Discussion as a way of teaching: tools and techniques for democratic classrooms. 2nd ed. San Francisco, CA: Jossey-Bass, 2005.
- 7. Smith AE, Humphreys MS. Evaluation of unsupervised semantic mapping of natural language with Leximancer concept mapping. Behav Res Methods 2006; 38: 262-279.
- 8. Watson M, Smith A, Watter S. Leximancer concept mapping of patient case studies. Lect Notes Comput Sci 2005; 3683: 1232-1238.
- 9. Grech MR, Horberry T, Smith A. Human error in maritime operations: analyses of accident reports using the Leximancer tool. Proceedings of the Human Factors and Ergonomics Society Annual Meeting 2002; 46: 1718-1721.
- 10. Landauer TK, Foltz PW, Laham D. An introduction to latent semantic analysis. Discourse Process 1998; 25: 259-284.
- 11. Australian and New Zealand College of Anaesthetists. Annual Report. Melbourne: ANZCA, 2006.
- 12. Australian and New Zealand College of Anaesthetists. Annual Report. Melbourne: ANZCA, 2008.