Preparedness for hospital practice among graduates of a problem-based, graduate-entry medical program

Sarah J Dean, Alexandra L Barratt, Graham D Hendry and Patricia M A Lyon
Med J Aust 2003; 178 (4): 163-166. || doi: 10.5694/j.1326-5377.2003.tb05132.x
Published online: 17 February 2003


Objective: To compare preparedness for hospital practice between graduates from a problem-based, graduate-entry medical program and those from other programs (undergraduate problem-based and traditional).

Design: Survey of graduates (by mailed questionnaire) and organisers of clinical training (by semistructured interview); results were compared with published results of surveys of graduates from other programs.

Setting and participants: All graduates of the first intake of the University of Sydney graduate-entry medical program were surveyed at the end of their first intern year (2001), along with the director of clinical training or intern manager at each of the New South Wales hospitals that employed the graduates.

Main outcome measures: Graduates' self-reported level of preparedness in the eight domains of the Preparation for Hospital Practice Questionnaire; and organisers' opinions of their strengths and weaknesses.

Results: 76 of 108 graduates from the graduate-entry program (70%) and organisers of clinical training at all 17 hospitals participated. Graduates from the program felt more prepared than did those from other programs in five of the eight domains assessed (interpersonal skills, confidence, collaboration, holistic care, and self-directed learning) and no less prepared in any domain. Organisers rated the graduates highly, especially in clinical competence, confidence, communication and professional skills. Opinions of interns' knowledge of basic sciences conflicted, with strengths and weaknesses mentioned with equal frequency.

Conclusion: Graduates from the graduate-entry, problem-based program are at least as well prepared for their intern year as graduates from traditional and undergraduate problem-based programs.

Several Australian medical schools have moved to graduate-entry problem-based programs, and others are considering changes to their curricula and entry criteria.1 Although the skills and characteristics of students enrolled in problem-based programs have been examined in detail,2-8 few studies have considered the effects of these curricula on graduates as interns. Existing studies suggest that graduates of problem-based medical programs feel themselves better prepared in many domains, including self-directed learning, than do graduates of traditional programs,1,9,10 but that they lag behind in understanding of science and the disease process.8,10

In 1997, the University of Sydney introduced a graduate-entry, problem-based medical program. The program is student-centred and emphasises development of life-long learning skills. The first cohort of students became interns in 2001. Towards the end of their intern year, we assessed their preparedness for hospital practice using the Preparation for Hospital Practice Questionnaire (PHPQ).1 This validated scale was developed to assess graduates of an undergraduate problem-based medical program at the University of Newcastle, New South Wales, and allows interns to self-report their preparedness across eight domains. We also interviewed the organisers of clinical training at the hospitals to which the interns were allocated about their preparedness.

Methods

Intern questionnaires

The PHPQ is a valid and reliable 41-item questionnaire, which contains eight subscales designed to assess key areas of medical hospital practice:

  • interpersonal skills;
  • confidence;
  • collaboration;
  • patient management and practical skills;
  • understanding science;
  • prevention;
  • holistic care; and
  • self-directed learning.

The PHPQ was given to hospitals for distribution to University of Sydney interns in September 2001. We asked the interns to return the completed questionnaire in an enclosed reply-paid envelope. Those who did not reply were sent a reminder letter, with another copy of the questionnaire and a reply-paid envelope. We also emailed a link to a web-based version of the questionnaire to all interns using their old, but still active, university email accounts.

Interviews with clinical training organisers

To collect qualitative data on the interns, two of us (S J D and G D H) conducted interviews between September and November 2001 with either the director of clinical training or, if unavailable, the manager responsible for the interns at each hospital to which the interns were allocated. Neither interviewer had a working relationship with any of these hospital staff, nor did the participants have responsibility for any workforce arrangements between the university and the hospitals. Interviews were 40–60 minutes, informal and semi-structured. They were conducted face-to-face at the metropolitan hospitals and by telephone at the three non-metropolitan hospitals.

Interviews began with an open-ended question asking organisers of clinical training for their perceptions of graduates from the University of Sydney graduate medical program compared to other interns employed at the hospital. Prompts were used to direct discussion into particular areas, such as the interns' basic science knowledge and how they coped within the hospital environment. Both interviewers were present at each interview and took comprehensive notes. The first two interviews were also audiotaped and transcribed, but, as the transcriptions matched the notes taken and generated little new information, other interviews were not audiotaped. After each interview, the interviewers compared notes for agreement and accuracy. To reduce bias, all interviews were conducted before interns' responses to questionnaires were compiled and analysed. Permission was obtained from the interviewees to publish de-identified comments for research purposes.

Data analysis

Results were compared with published data collected in 1995 using the PHPQ from interns who graduated from the five-year undergraduate problem-based program at the University of Newcastle or from traditional medical programs at the University of Sydney and the University of New South Wales.1

We analysed quantitative data using the Statistical Package for the Social Sciences, version 10.0 for Windows.11 Qualitative data were categorised independently by two researchers (G D H and S J D) and then compared for similarities and differences. P M A L also checked the data and categories to confirm agreement. Once the categories were established, the comments in each category were sorted into those representing strengths and weaknesses of the interns.


Results

Of the 108 graduates, 76 completed the questionnaire (70% response rate). Median age of those who responded was 28 years (range, 25–42 years), and 59% were male. Their age and sex were representative of the total group of 108 (median age at enrolment into the four-year course, 23 years; range, 19–45 years; with 57% male).

Interviews were held with organisers of clinical training at all 17 hospitals to which University of Sydney interns had primary attachments. The director of clinical training was interviewed at 10 hospitals, and the intern manager at six, while both were present at the interview at one hospital.

Intern self-ratings

Self-reported preparedness for hospital practice among interns from the University of Sydney graduate-entry program is shown in Box 1, along with previously published results for interns who graduated from other programs.1

The Sydney graduates felt more prepared in five domains (interpersonal skills, confidence, collaboration, holistic care, and self-directed learning) than graduates from both the undergraduate problem-based learning course and more traditional courses. Scores for patient management and practical skills, understanding science and prevention were similar across all programs.

Clinical training organiser evaluations

Eight categories were generated from the comments of organisers of clinical training. Within each category, comments were grouped as strengths or weaknesses (Box 2). The organisers identified clinical competence, confidence, and communication and professional skills as the major strengths of the Sydney graduates, and knowledge of anatomy and microbiology (ie, basic science knowledge) as the primary weakness. Overall, the training organisers' impression of the Sydney graduates was positive.


Discussion

These data demonstrate that graduates from the University of Sydney graduate-entry course felt more prepared for hospital practice than did graduates from an undergraduate problem-based program and from traditional courses in five of the eight domains assessed. The qualitative data gained from interviews with organisers of clinical training were consistent with these self-perceptions.

These findings must be interpreted with caution, as the comparisons were not concurrent, and ratings of preparedness by graduates of the other programs may have changed since 1995. In addition, selection procedures differed between the programs, a confounding variable which might account for differences in self-reported outcomes.8 For example, University of Sydney students, being older at entry to the program, may be more mature or better communicators from the outset.9 We did not adjust for age, as we consider older age an inevitable feature of a graduate program. However, we note that the mean ratings reported for 1995 interns were adjusted for age and sex, although the authors reported, "The influence of age was not marked and there were no gender differences".1 Furthermore, where age was significant in their analysis, it was younger interns who had higher scores.

Our findings from interviews with the organisers of clinical training are congruent with overseas findings that supervisors consider graduates from problem-based medical programs to be better communicators than graduates from traditional programs.9 This may be partly explained by changed teaching methods and curricula, which include a focus on personal and professional development and are designed to enhance and promote qualities such as teamwork and ethical practice. Similarly to our findings, one study reported that graduates from a traditional curriculum rated the adequacy of their knowledge base significantly higher than did graduates from a problem-based course.8 However, that study also found no significant differences between graduates' ratings of their preparedness for postgraduate education.

In our study, comments from training organisers about the interns' basic science knowledge were conflicting. While half the organisers mentioned knowledge of basic mechanisms as a strength, the other half commented that this knowledge was weak. Conflicting data such as these may have arisen because of the small number of interns from the University of Sydney at any one hospital (maximum, eight to 10).

This study has a number of other limitations. Participants were the first graduating cohort, with high levels of enthusiasm. The information gained from organisers of clinical training may have been enhanced had the interviews been more structured. No weaknesses were recorded for some categories, and it is not known how the context of the interviews may have influenced the results. However, we found the organisers frank in their responses, with some admitting to earlier doubts about what to expect from the new graduates. Feedback from other stakeholders about the performance of graduates (eg, patients, other healthcare personnel and colleagues) would have increased the validity of the findings. Finally, we acknowledge the biases inherent in the qualitative aspect of this study.

We are continuing to collect data to evaluate the outcome of the new graduate-entry program. Because of the feedback from the first cohort of graduates, the basic sciences now receive more attention in the curriculum, and future evaluations will assess the impact of this on the preparedness of graduates.

We cautiously conclude that the combination of graduate entry and the new teaching and learning environment has contributed to the interns' performance, particularly their personal and professional skills (communication, confidence, collaboration, holistic care and self-directed learning). However, differences in outcomes may be more or less apparent at later stages in the careers of these interns.10 It is therefore crucial to continue measuring the long-term effects of curriculum reform.

Box 2: Summary of comments from directors of clinical training about interns from the University of Sydney graduate-entry (GE) program


Distribution of comments

Examples of strengths

Examples of weaknesses

Clinical competence

Overall strong (no weaknesses mentioned)

All Sydney GE graduates are extremely well prepared in terms of their clinical examination skills, etc. They are well and truly in the top third of all interns; they are above average in their clinical ability.

None mentioned.


Confidence

Overall strong (no weaknesses mentioned)

Are quite happy to upgrade their duties without feeling threatened. Have greater knowledge and confidence to put this knowledge into practice, whereas a lot of the other interns are often too hesitant and apprehensive about doing this.

None mentioned.

Communication skills

Overall strong (no weaknesses mentioned)

Communication with other staff members is the most noticeable strength.

None mentioned.

Professional skills (eg, time management)

Overall strong (no weaknesses mentioned)

Have an inherent understanding of system requirements (eg, overtime). They see their responsibilities as interns as an important part of their job, they do not feel "precious", and realise that they are at the beginning of their careers.

None mentioned.


Maturity

Moderately strong (few weaknesses mentioned)

Are much more mature than other graduates, not only in terms of other degrees, but other work they have done, as well as being more used to the realities of working with people.

Can be very mature, but some situations can create antagonisms as to who is boss.

Teamwork skills

Moderately strong (few weaknesses mentioned)

"Pleasure to work with, excellent team player." Their ability to get on with people, to accept all types of personalities, has been good.

Do not get along well with the other interns; some of them are a bit individual and not good team players.

Basic science knowledge

Balanced (strengths and weaknesses mentioned with equal frequency)

Basic science knowledge is consistent and better than expected across all terms. They are advanced in their basic science knowledge compared with other graduates.

Are lacking in the core knowledge of microbiology and anatomy. There are gaps in their basic science knowledge, but they may not necessarily impact on their ability to become good doctors.

Contribution to work environment

Balanced (strengths and weaknesses mentioned with equal frequency)

Sydney GE graduates are more willing to give their ideas about new initiatives or ways to implement changes, whereas undergraduates are not as proactive until they become resident medical officers.

Have unreasonably high expectations about rostering and the working environment.

  • Sarah J Dean1
  • Alexandra L Barratt2
  • Graham D Hendry3
  • Patricia M A Lyon4

  • University of Sydney, Sydney, NSW.



We are grateful to Dr Isobel Rolfe and Dr Sallie-Anne Pearson (University of Newcastle, NSW) for use of their questionnaire, and also thank Associate Professor Jill Gordon (University of Sydney) and the three anonymous reviewers for helpful suggestions. We also thank the interns and organisers of clinical training for their cooperation.

Competing interests:

All authors are members of the Faculty of Medicine, University of Sydney, NSW.

  • 1. Hill J, Rolfe I, Pearson SA, Heathcote A. Do junior doctors feel they are prepared for hospital practice? A study of graduates from traditional and non-traditional medical schools. Med Educ 1998; 32: 19-24.
  • 2. Shin JH, Haynes RB, Johnston ME. Effect of problem-based, self-directed undergraduate education on life-long learning. Can Med Assoc J 1993; 148: 969-976.
  • 3. Peterson M. Skills to enhance problem-based learning. Med Educ Online [serial online] 1997; 2(3). Available at (accessed Apr 2002).
  • 4. Norman GR, Schmidt HG. The psychological basis of problem-based learning: A review of the evidence. Acad Med 1992; 67: 557-565.
  • 5. Schmidt HG. Problem-based learning: does it prepare medical students to become better doctors? Med J Aust 1998; 168: 429-430.
  • 6. Finucane PM, Johnson SM, Prideaux DJ. Problem-based learning: its rationale and efficacy. Med J Aust 1998; 168: 445-448.
  • 7. Albanese M. Problem-based learning: why curricula are likely to show little effect on knowledge and clinical skills. Med Educ 2000; 34: 729-738.
  • 8. Mann KV, Kaufman DM. A comparative study of problem-based and conventional undergraduate curricula in preparing students for graduate medical education. Acad Med 1999; 74 (10 Suppl): S4-S6.
  • 9. Schmidt HG, van der Molen HT. Self-reported competency ratings of graduates of a problem-based medical curriculum. Acad Med 2001; 76: 466-468.
  • 10. Jones A, McArdle PJ, O'Neill PA. Perceptions of how well graduates are prepared for the role of pre-registration house officer: A comparison of outcomes from a traditional and integrated PBL curriculum. Med Educ 2002; 36: 16-25.
  • 11. SPSS for Windows [computer program]. Version 11.0.0. Chicago (IL): SPSS Inc, 2001.

