
Development and pilot study of the Primary Care Practice Improvement Tool (PC-PIT): an innovative approach

Lisa Crossland, Tina Janamian, Mary Sheehan, Victor Siskind, Julie Hepworth and Claire L Jackson
Med J Aust 2014; 201 (3): S52-S55. doi: 10.5694/mja14.00262
Published online: 21 July 2014

Abstract

Objective: To assess the usability and validity of the Primary Care Practice Improvement Tool (PC-PIT), a practice performance improvement tool based on 13 key elements identified by a systematic review. It was co-created with a range of partners and designed specifically for primary health care.

Design: This pilot study examined the PC-PIT using a formative assessment framework and mixed-methods research design.

Setting and participants: Six high-functioning general practices in Queensland, Australia, between February and July 2013. A total of 28 staff participated: 10 general practitioners, six practice or community nurses and 12 administrators (four practice managers, one business manager and eight reception or general administrative staff).

Main outcome measures: Readability, content validity and staff perceptions of the PC-PIT.

Results: The PC-PIT offers an appropriate and acceptable approach to internal quality improvement in general practice. Quantitative assessment scores and qualitative data from all staff identified two areas in which the PC-PIT required modification: a reduction in the indicative reading age, and simplification of governance-related terms and concepts.

Conclusion: The PC-PIT provides an innovative approach to address the complexity of organisational improvement in general practice and primary health care. This initial validation will be used to develop a suite of supporting, high-quality and free-to-access resources to enhance the use of the PC-PIT in general practice. Based on these findings, a national trial is now underway.

There has been increased focus on the importance of primary health care across Australia, the United States and the United Kingdom.1-4 Australia's first National Primary Health Care Strategy, developed in 2010, refocused attention on quality primary health care through a range of approaches, including regional integration; information and technology, including e-health; ensuring a skilled workforce; improving infrastructure; and financing and system performance.4 It also heralded the development of new indicators covering performance improvement; teamwork; patient-centric approaches to care; an emphasis on quality and safety benchmarking; a combined approach to organisational and clinical governance; and strategies for change management.4

While much has been written about the development and implementation of approaches to quality improvement in tertiary care or approaches adapted to primary health care, there is a paucity of research on approaches designed specifically for primary health care settings.

There is currently no single tool available to general practices that combines the traditional areas of practice organisation (eg, clinical governance and the use of information technology) with more contemporary and, as yet, less widely used elements (eg, change management and leadership) in an internally facilitated approach. Consequently, there is a need for quality improvement tools and approaches that are “bespoke” to primary health care, sensitive to the wide variations in primary health care contexts, and that take a whole-of-practice approach. These can be based on organisational assessment approaches that are sensitive to, and inclusive of, the concerns of clinical management.5-7

In a systematic review published in this supplement, we identified 13 key elements integral to high-performing practices.8 Building on this research, we developed the Primary Care Practice Improvement Tool (PC-PIT).

The PC-PIT seeks to link tangible aspects of quality improvement (eg, the presence of defined processes of care, formalised meetings, data collection and review) and less tangible domains (eg, communication, change management and the creation of a culture of performance). Although it was not designed to replace existing accreditation processes, it can highlight specific areas across clinical and organisational cultural aspects in which a practice may wish to focus its improvement efforts.

The PC-PIT aims to (1) be adaptable to variable and dynamic individual service settings; (2) address clinical governance and the impact of organisational management as part of an ongoing quality improvement cycle; (3) be led by practice managers as an internal quality improvement process based on a whole-of-practice approach; and (4) be delivered online and at low cost.

In this article, we describe the development of the PC-PIT and subsequent validation in a pilot study. The pilot study had three objectives: to determine the readability of the PC-PIT; to establish its content validity; and to explore staff perceptions of its use in general practice.

Methods

Development of the PC-PIT

Based on the findings of our systematic review (Box 1),8 the PC-PIT was co-created18 using cyclical feedback from a range of key national stakeholders, including the Australian Association of Practice Managers (AAPM), the Royal Australian College of General Practitioners (RACGP), the Improvement Foundation (Australia), the Australian Primary Health Care Nurses Association and the Australian Commission on Safety and Quality in Health Care. This process provided ongoing input into the elements used, and the structure and mode of delivery of the tool.

The PC-PIT was designed as an online instrument that could be completed confidentially by practice staff, facilitated by the practice manager. Staff receive a personal link to the online tool, provide basic demographic information, and rate their perception of each of the 13 elements on a five-point Likert scale. To maintain the privacy and confidentiality of respondents, a report is generated from the completed tools. The report provides average scores for each of the 13 elements for the entire practice. Results are presented to the practice manager in a spider diagram, separating the lower scoring elements (those with an average of 1–3) from the higher scoring elements (those with an average of 4–5). Lower scoring elements indicate where the practice may wish to make some improvement. If the practice has more than 15 staff members, practice managers can request aggregated scores for each employee group (administration, management, clinical, allied health) and differentiate scores provided by permanent versus contract staff.
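
As a minimal sketch of the reporting logic described above (illustrative only; the element names, thresholds, function and variable names are our assumptions, not the production PC-PIT code), the following Python snippet averages staff ratings for each element and flags those averaging 1–3 as candidate areas for improvement:

```python
from collections import defaultdict

def practice_report(responses):
    """responses: list of dicts mapping element name -> Likert rating (1-5)."""
    ratings_by_element = defaultdict(list)
    for staff_ratings in responses:
        for element, rating in staff_ratings.items():
            ratings_by_element[element].append(rating)

    report = {}
    for element, ratings in ratings_by_element.items():
        average = sum(ratings) / len(ratings)
        report[element] = {
            "average": round(average, 2),
            # Averages of 1-3 are flagged as candidate areas for improvement;
            # averages of 4-5 indicate the practice already scores well.
            "flagged_for_improvement": average <= 3,
        }
    return report

# Example with two fictional staff responses for two of the 13 elements.
responses = [
    {"Leadership": 4, "Governance": 2},
    {"Leadership": 5, "Governance": 3},
]
print(practice_report(responses))
```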

After reviewing their practice report and discussing the scores with practice staff, practice managers then undertake the Plan–Do–Study–Act cycle. This approach guides practice managers through a process to identify area(s) for improvement, plan and monitor strategies to achieve the improvement, identify staff to undertake the improvement, and identify clear indicators to measure when the improvement has been achieved.

Pilot study

The pilot study was based on a formative assessment framework and mixed-methods research design. Assessment of the PC-PIT was conducted with a purposive sample of six general practices in Brisbane, Australia. Critical case sampling19 was used to select the practices from which the most detailed and information-rich data could be obtained, because all practices had experience in using quality improvement processes and integrating them into the general practice setting.

Each practice provided detailed quantitative and qualitative assessment. In addition, two practice managers were experienced practice accreditation assessors.

All staff across the six practices were provided with a personal link to access the PC-PIT and asked to complete it within a week. They were also provided with a hard copy questionnaire, which elicited quantitative and qualitative data.

The study was conducted between 1 February and 30 July 2013. Ethics approval was granted by the University of Queensland Ethics Committee.

Quantitative data collection

Readability was assessed using the Flesch–Kincaid Readability Formula and Gunning–Fog Index in a combined online test.20 The Flesch–Kincaid grade level indicates a reading age based on the US education reading assessment system. The Gunning–Fog Index score is based on average sentence length and the proportion of complex words (words containing three or more syllables) in the selected text. A limitation of the Index is that not all complex words are difficult to understand.
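
For context, the two measures referred to above follow standard published formulas. The Python sketch below shows those formulas; it is not the combined online calculator used in the study,20 and the syllable count is a rough approximation:

```python
import re

def count_syllables(word):
    # Crude approximation: count groups of vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z]+", text)
    n_words = max(1, len(words))
    syllables = sum(count_syllables(w) for w in words)
    # Complex words: three or more syllables, as used by the Gunning-Fog Index.
    complex_words = sum(1 for w in words if count_syllables(w) >= 3)

    # Flesch-Kincaid grade level (US school grade).
    fk_grade = 0.39 * (n_words / sentences) + 11.8 * (syllables / n_words) - 15.59
    # Gunning-Fog Index (approximate years of formal education required).
    fog = 0.4 * ((n_words / sentences) + 100 * (complex_words / n_words))
    return round(fk_grade, 1), round(fog, 1)

print(readability("Organisational governance comprises non-clinical factors "
                  "that contribute to the practice's performance."))
```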

Participants completed a series of Likert scales that specifically asked for ratings of the following content:

  • Readability: how easy was it to understand the PC-PIT; were there any words or phrases you were unfamiliar with or unsure of?
  • Content validity: relevance to general practice; relevance to the role and position of practice staff; and wording and understanding (where did you get stuck; why did you get stuck [layout versus content]; what does this element mean to you and how would you describe it?).
  • Process validity: usability of the tool, including ease of use online; layout of questions; problems or issues completing the online PC-PIT; and suggested changes to the layout and process of completion.

Qualitative data collection

Participants were asked to respond to a series of open-ended questions about their experiences of completing the PC-PIT and their perceptions of the relevance and usefulness of the tool to general practice. Semi-structured interviews were conducted with practice managers to gain feedback on perceptions of the content of the PC-PIT, its usefulness as a primary health care improvement tool and the process of using the PC-PIT.

Likert scale data for each practice were analysed using Microsoft Excel 2010 (Microsoft Corporation) to generate basic descriptive statistics (frequencies). Semi-structured interviews and open-ended qualitative data were transcribed and imported into NVivo 10 (QSR International), and an inductive thematic analysis was undertaken to identify common themes.21
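
As an illustration of the frequency tabulation described above (the study used Microsoft Excel; the column names and data here are invented), a short pandas sketch that counts how often each Likert rating was given for each element might look like this:

```python
import pandas as pd

# Illustrative only: each row is one staff member's rating of one PC-PIT element.
data = pd.DataFrame({
    "element": ["Leadership", "Leadership", "Governance", "Governance"],
    "rating": [4, 5, 2, 3],
})

# Frequency of each Likert rating (1-5) per element.
frequencies = data.groupby(["element", "rating"]).size().unstack(fill_value=0)
print(frequencies)
```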

Results

Six practices were enrolled in the pilot. Four practices completed the pilot and provided complete datasets. Two practices did not complete all data collection owing to staff absences or building renovations during the study period, and were not included in the final data analysis. In total, 28 staff completed the pilot: 10 GPs, six practice or community nurses and 12 administrators (four practice managers, one business manager and eight reception or general administrative staff).

Readability of the PC-PIT

The PC-PIT had a high indicative reading age of greater than 20 years. The definitions of organisational and clinical governance, along with those related to information and information technology, were scored as highly complex text. These scores were consistent with the qualitative feedback from several administrative practice staff members, who assessed these element descriptions as difficult to understand. Appendix 1 provides a comparison of the readability scores for each of the PC-PIT elements and their corresponding definitions.

Many administrative and reception staff (with the exception of practice managers) found elements of the PC-PIT difficult to understand. Two GPs also provided low ratings (1–3) for the element relating to education and training, reflecting confusion about how this element related to the continuing professional development requirements that apply to GPs in practice. Details are shown in Appendix 2.

Two key difficulties were identified by staff in the qualitative feedback: complicated wording, and difficulty understanding the terms governance and performance (Box 2). A range of staff (nurses, allied health professionals and administrative staff) felt that the wording of the PC-PIT definitions was long and complicated. Nursing staff also suggested changing the term governance to management, to clarify the meaning for all staff.

Acceptability and relevance of the PC-PIT to general practice

Nineteen of the 28 participants rated the PC-PIT as a useful tool for assessing key elements of practice organisation and function. Participants emphasised the relevance of the PC-PIT to everyday practice work and planning, and its role in allowing all staff to be involved in identifying areas for improvement (Box 3). The PC-PIT was seen as highly usable, with 21 participants rating it as easy to complete and preferring the online format to a paper-based form.

However, eight participants did not think it would be usable as a future assessment tool in its current format. This group included administrative staff, who generally found the PC-PIT elements difficult to understand, and two GPs, who perceived that the PC-PIT covered areas predominantly outside clinical management processes. Four participants did not respond to the question.

Discussion

Overall, participants assessed the PC-PIT as an acceptable and easy-to-complete quality improvement tool that offers a new approach to improving practice performance in areas that are not routinely addressed. Although inclusion of all staff was noted as particularly useful, some sectors of senior administration felt that not all staff needed to be aware of areas of practice function, such as performance measures and aspects of organisational and clinical governance. Such attitudes may compromise the future effective use of the PC-PIT, particularly as many staff roles and responsibilities are highly interdependent, and improved teamwork is based on an understanding and appreciation of the complementary aspects of these roles.

Based on findings from the pilot, we have reworded and simplified the PC-PIT element definitions. In response to the attitudes expressed by business managers, an online prerecorded presentation now provides an introduction to the purpose of the PC-PIT and the whole-of-practice approach. This aims to clarify the purpose of the tool and to ensure all staff have a basic understanding of how the elements relate to practice performance and to their own roles and responsibilities.

A national trial of the PC-PIT is now underway. The trial will assess process and construct validity in a range of general practices and primary health care business models in different settings, including small private practices and large corporate service models across Australia. It will seek further, more in-depth information about how individual staff members perceive and score the 13 elements of the PC-PIT, the utility and effectiveness of the tool across the range of practice models, and the role of context (eg, inner urban, regional and rural settings) in the adaptation and use of the tool. Participation in the trial allows staff to gain quality improvement and continuing professional development points under the RACGP 2014–2016 triennium and points under the AAPM professional development program.

The PC-PIT provides an innovative approach to addressing the complexity of organisational improvement in general practice and primary health care, matched to key elements identified in international evidence as integral to high-functioning practices. Our study indicates that the PC-PIT has content and process validity, offering a comprehensive, practice-led approach to organisational aspects of continuous quality improvement. Further research to refine the tool has now commenced with primary health care services across Australia.

1 Elements and sub-elements of the Primary Care Practice Improvement Tool

  • Patient-centred and community-focused care: a patient-centric approach to care, drawn from the patient-centred medical home approach.9,10
  • Leadership: taken from aspects of leadership in primary health care; encompasses clinical and organisational leadership, and includes staff who may be involved in leading aspects of change or improvement.11
  • Governance: divided into two sub-elements, organisational and clinical governance. Organisational governance comprises the non-clinical factors that contribute to the practice's performance;12 clinical governance relates to the processes used to manage clinical care and maintain patient safety.
  • Communication: incorporates aspects of the integration of care identified previously,13 and includes three sub-elements: team-based care, availability of information for patients, and availability of information for staff.
  • Change management: incorporates three attributes of organisational change management and sustainable change:14,15 readiness for change, education and training, and incentives.
  • Performance: incorporates two sub-elements identified in previous assessment tools:16,17 process improvement and performance results.
  • Information and information technology: relates to the internal software and data management tools used by practice staff (clinical and non-clinical), their “fitness for purpose” and ease of use; includes the electronic systems by which information is shared with other key external services.
  • Contextual practice information: relevant information includes staff role, length of time in role, length of time in primary health care, and the practice mission or vision statement.


Provenance: Commissioned; externally peer reviewed.

  • Lisa Crossland1
  • Tina Janamian1
  • Mary Sheehan2
  • Victor Siskind2
  • Julie Hepworth3
  • Claire L Jackson4,5

  • 1 Centre of Research Excellence in Primary Health Care Microsystems, University of Queensland, Brisbane, QLD.
  • 2 Psychology and Counselling, Queensland University of Technology, Brisbane, QLD.
  • 3 Public Health and Social Work, Queensland University of Technology, Brisbane, QLD.
  • 4 Discipline of General Practice, University of Queensland, Brisbane, QLD.
  • 5 Centres for Primary Care Reform Research Excellence, University of Queensland, Brisbane, QLD.


Correspondence: l.crossland1@uq.edu.au

Acknowledgements: 

We gratefully acknowledge Australian Primary Health Care Research Institute funding. We also give special thanks to all the practice staff who participated in this pilot study.

Competing interests:

No relevant disclosures.

  • 1. World Health Organization. The World Health Report 2008: primary health care — now more than ever. Geneva: WHO, 2008.
  • 2. NHS Executive, UK. Developing NHS purchasing and GP fundholding: towards a primary care-led NHS. EL (94) 79. Leeds: Department of Health, 1994.
  • 3. Davis K, Abrams M, Stremikis K. How the Affordable Care Act will strengthen the nation's primary care foundation. J Gen Intern Med 2011; 26: 1201-1203.
  • 4. Australian Government Department of Health and Ageing. Building a 21st century primary health care system: Australia's first National Primary Health Care Strategy. Canberra: DoHA, 2010.
  • 5. Mannion R, Konteh F, McMurray R, et al. Measuring and assessing organisational culture in the NHS. Report to the National Institute for Health Service Delivery and Organisation Programme, UK, 2008.
  • 6. Brennan SE, Bosch M, Buchan H, Green SE. Measuring organizational and individual factors thought to influence the success of quality improvement in primary care: a systematic review of instruments. Implement Sci 2012; 7: 121.
  • 7. Rhydderch M, Elwyn G, Marshall M, Grol R. Organisational change theory and the use of indicators in general practice. Qual Saf Health Care 2004; 13: 213-217.
  • 8. Crossland L, Janamian T, Jackson CL. Key elements of high-quality practice organisation in primary health care: a systematic review. Med J Aust 2014; 201 (3 Suppl): S47-S51. <MJA full text>
  • 9. Nabitz U, Klazinga N, Walburg J. The EFQM excellence model: European and Dutch experiences with the EFQM approach to health care. Int J Qual Health Care 2000; 12: 191-201.
  • 10. Stange KC, Nutting PA, Miller WL, et al. Defining and measuring the patient-centered medical home. J Gen Intern Med 2010; 25: 601-612.
  • 11. Slavkin H. Leadership for health care in the 21st century: a personal perspective. J Healthc Leadersh 2010; 2: 35-41.
  • 12. Engels Y, Campbell S, Dautzenberg M, et al. Developing a framework of, and quality indicators for, general practice management in Europe. Fam Pract 2005; 22: 215-222.
  • 13. Jackson C, Nicholson C. Making integrated healthcare delivery happen – a framework for success. APJHM 2008; 3: 19-24.
  • 14. Lehman WE, Greener JM, Simpson DD. Assessing organizational readiness for change. J Subst Abuse Treat 2002; 22: 197-209.
  • 15. Mannion R, Konteh FH, Davies HTO. Assessing organisational culture for quality and safety improvement: a national survey of tools. Qual Saf Health Care 2009; 18: 153-156.
  • 16. DeJong DJ. Quality improvement using the Baldrige Criteria for Organizational Performance Excellence. Am J Health Syst Pharm 2009; 66: 1031-1034.
  • 17. Nelson E, Batalden P, Godfrey M, editors. Quality by design: a clinical microsystems approach. San Francisco: Jossey-Bass, 2007.
  • 18. Ramaswamy V, Gouillart F. Build the co-creative enterprise. Harv Bus Rev 2010; 88: 100-109, 150.
  • 19. Patton MQ. Qualitative research and evaluation methods. 3rd ed. Thousand Oaks, Calif: Sage Publications, 2002.
  • 20. Readability calculator [website]. http://www.online-utility.org/english/readability_test_and_improve.jsp (accessed Jun 2014).
  • 21. Pope C, Ziebland S, Mays N. Qualitative research in health care. Analysing qualitative data. BMJ 2000; 320: 114-116.
