
Building capacity in medical education research in Australia

Chris Roberts and Jennifer J Conn
Med J Aust 2009; 191 (1): 33-34. doi: 10.5694/j.1326-5377.2009.tb02672.x
Published online: 6 July 2009

Medical education research has grown significantly as a discipline over the past decade. Evidence for this can be seen in the expanding number of peer-reviewed medical education journals, citations and international research conferences in the field. Standards for research have been articulated in initiatives such as the Best Evidence in Medical Education (BEME) Collaboration and the Campbell Collaboration, a sibling of the Cochrane Collaboration.1 Scientific evidence is increasingly seen as the key driver of educational policy and practice, with medical educators looking to the literature for professional guidance on matters as diverse as curriculum design, instructional methods, simulation, assessment and professionalism, in both undergraduate and postgraduate settings. There is, however, a widespread perception in Australia and internationally that medical education research lacks methodological rigour, is not based on sound theoretical foundations, fails to influence education and training systems, and has been unsuccessful in attracting research grants to advance the discipline.1,2 Medical education researchers themselves have been among the harshest critics in this respect.

One of the fundamental issues for medical education research has been that it is perceived as being of poor quality when judged against biomedical research, where the randomised controlled trial (RCT) is seen as the scientific gold standard. The application of a biomedical paradigm to research in medical education, however, is limited by logistic and ethical constraints.3 Many educational interventions and reforms are highly complex and require long-term follow-up, and controlling confounders is a near impossible task. Control groups are difficult to construct, and are highly susceptible to contamination in the competitive environment in which medical students and trainees function. In addition, there are significant ethical problems with randomly assigning participants to interventions where outcomes are used to decide progression or selection.

Medical education research has more in common with social science research than with biomedical research.4 The field of social science research provides a range of research methods that can be used to rigorously describe and analyse complex phenomena, generate theoretical models, and frame strategic questions. Research methods in themselves are neither scientific nor unscientific; rather, it is the appropriate application of a method to a particular problem that defines scientific quality. Research endeavours in medical education typically include exploring factors that underpin effective learning and teaching processes, or those that promote changes in student behaviour. Applying a single research method, such as the RCT, would force researchers to adapt their enquiries to the method, leaving important research questions unanswered and medical education policy undeveloped.

Unfortunately, social science research methods, such as qualitative research, are often seen as the soft option,5 and there has been a perception that they can be carried out with little training.4 A good RCT is defined by a number of rigorous criteria; for example, a clear research question that is answered using reliable and valid outcome measures and appropriate statistical analyses. Likewise, in qualitative methodology,6 the analysis must be grounded in theoretical underpinnings derived from the literature, and have inbuilt quality assurance methods so that others can repeat the research and arrive at similar findings.

A justifiable criticism of medical education research is that there has been an over-reliance on descriptive studies with easily measurable outcomes, such as learner satisfaction, or scoping studies in which research is commissioned to consult peers about what they are doing, or think they should do, to solve a problem. This is not to say that these methods of research do not have their place; they are helpful for laying the foundation for new fields of enquiry. Their limitations, however, include an inability to measure hard endpoints such as behavioural change in learners, to clarify underlying mechanisms, or to add to theory.7 Many studies can also be criticised for being small, of short duration and opportunistic, rather than being driven by hypotheses. They are often based at single centres, which compromises the generalisability of the findings. This lack of a programmatic approach to medical education research means that the iterative research cycle of building on previous results often breaks down. Consequently, the results of many research endeavours are not fed back into practice, and fail to make an impact at the level of policy.

Many of the shortcomings that have beleaguered medical education research have been perpetuated by a lack of funding.8 Most medical education research is unfunded, or heavily subsidised by universities, where the supply of money is often limited and unpredictable. Consequently, it is difficult to build programmatic or complex research without external funding. In Australia, responsibility for funding medical education research lies somewhere between the Department of Health and Ageing and the Department of Education, Employment and Workplace Relations, with no single centre of responsibility. There are as yet no medical education research panels on the National Health and Medical Research Council (NHMRC) or the Australian Research Council (ARC), the major sources of government research funding, although the ARC is now using medical education experts to assess educational grant applications.

What are the enablers that will promote more rigorous and better-funded medical education research in Australia? A key strategy must be the creation of a critical mass of trained researchers in the discipline.1 It has long been recognised that the development of scientific understanding in any field depends on a strong research community engaged in a complex combination of professional criticism and self-correction. Research capacity in medical education needs to be built by improving existing professional development programs, providing greater access to training in social science research methods, identifying and supporting researchers early in their careers, enhancing supervision of research training through higher degrees, supporting a diversity of entrants through targeted scholarships, and attracting the best people to work in Australia.

At a local level, university-based academic centres have been developed with the aspiration of competing with successful research centres such as those in Toronto, Chicago, San Francisco and Maastricht.9 However, medical educators located within traditional medical education units have carried high service-provision burdens that have constrained any serious research output.10 Educational innovation needs to be based on sound theoretical foundations, with outcomes fed back into policy and practice and into new research directions. As the relationship between research and teaching is synergistic, there is a danger in setting academics up in ivory towers, separated from the realities of educational service delivery and clinical service provision.

Medical education research centres in Australia need to focus on building collaborative, externally funded programs of research, and to be supported by their universities in doing so. There has been some early success with ARC grants, but this achievement needs to be built on with future applications that demonstrate need, good value, sound methodology and the track record of the chief investigators. Shifting the focus from learner satisfaction to performance-based measures and, more ambitiously, to clinical outcomes1,2 is likely to make submissions more attractive to funding bodies. Proposals are more likely to be successful if they reflect nationally or internationally agreed research priorities, involve programmatic research, and integrate funding streams from both government and private bodies. Applications can also be strengthened by collaborating with other agencies to bring in discipline-based and methodological expertise. The ARC (for both Discovery and Linkage grant streams) and the Australian Teaching and Learning Council should expect to see a growth in well-thought-out research proposals from strategic partnerships of medical educators and other leaders in the field.

Finally, institutional leadership is required at both a federal and a local level to develop capacity in medical education research. Medical Deans Australia and New Zealand have shown they can provide leadership by setting up the Medical Schools Outcomes Database project, which offers longitudinal tracking of students into the workforce and beyond, a major source of research possibilities.11 They have also been active in setting a research agenda around procedures for selecting students into medical schools. The professional health educators’ body, the Australian and New Zealand Association of Medical Education (ANZAME: the Association for Health Professional Education), aims to promote research and encourage interprofessional interaction through its annual scientific meeting. There are concerns, however, that this meeting is being eclipsed by the Asia Pacific Medical Education Conference held in Singapore, and more needs to be done to attract leading local and international researchers to share their ideas within Australia.

  • Chris Roberts1
  • Jennifer J Conn2

  • 1 Office of Postgraduate Medical Education, University of Sydney, Sydney, NSW.
  • 2 Medical Education Unit, Faculty of Medicine, Dentistry and Health Sciences, University of Melbourne, Melbourne, VIC.



Competing interests:

None identified.

  • 1. Todres M, Stephenson A, Jones R. Medical education research remains the poor relation. BMJ 2007; 335: 333-335.
  • 2. Chen FM, Burstin H, Huntington J. The importance of clinical outcomes in medical education research. Med Educ 2005; 39: 350-351.
  • 3. Scriven M. A summative evaluation of RCT methodology: and an alternative approach to causal research. J MultiDisc Eval 2008; 5 (9): 11-24.
  • 4. Schuwirth LWT, van der Vleuten CPM. Challenges for educationalists. BMJ 2006; 333: 544-546.
  • 5. Berliner D. Educational research: the hardest science of all. Educat Researcher 2002; 31: 1-20.
  • 6. Colliver JA, McGaghie WC. The reputation of medical education research: quasi-experimentation and unresolved threats to validity. Teach Learn Med 2008; 20: 101-103.
  • 7. Cook DA, Bordage G, Schmidt HG. Description, justification and clarification: a framework for classifying the purposes of research in medical education. Med Educ 2008; 42: 128-133.
  • 8. Reed DA, Cook DA, Beckman TJ, et al. Association between funding and quality of published medical education research. JAMA 2007; 298: 1002-1009.
  • 9. Gruppen L. Creating and sustaining centres for medical education research and development. Med Educ 2008; 42: 121-122.
  • 10. Albert M, Hodges B, Regehr G. Research in medical education: balancing service and science. Adv Health Sci Educ Theory Pract 2007; 12: 103-115.
  • 11. Medical Deans Australia and New Zealand. Overview of Medical Deans Australia and New Zealand Medical Schools Outcomes Database (MSOD) and longitudinal tracking project. http://www.medicaldeans.org.au/msod_files/Document%20A_Overview%20MSOD.pdf (accessed May 2009).
