Quality in qualitative research

Simon C Kitto, Janice Chesters and Carol Grbich
Med J Aust 2008; 188 (4): 243-246. || doi: 10.5694/j.1326-5377.2008.tb01595.x
Published online: 18 February 2008

Research in health care, especially clinical medicine, is an increasingly complex field that ranges from small-scale, cutting-edge benchtop science to large-scale population studies. Recently, there has been increasing recognition among evidence-based clinical researchers that it is important to look towards qualitative research. In addition, an expanding interest in examining the attitudes, beliefs and experiences of those involved in or affected by the delivery of health care has brought qualitative research into the spotlight.1 The increased interest in this form of research has led to concerns among readers and reviewers about the assessment of quality.2-10 In this article, we provide definitions and standards for qualitative research that can be used for assessment. We expect that authors who submit work for publication that adheres to the suggestions in this article will increase the likelihood of acceptance for publication and, more importantly, will enhance the transferability of their findings into policy and practice.

Qualitative research

Most commonly, qualitative research is concerned with the systematic collection, ordering, description and interpretation of textual data generated from talk, observation or documentation. Its goal is to explore the behaviour, processes of interaction, and the meanings, values and experiences of purposefully sampled individuals and groups in their “natural” context, using methods such as interviewing, observation and document analysis.11-14 The capacity to make conceptual generalisations from the local context of a qualitative study to other settings is the desired outcome.

In contrast, most quantitative research is concerned with measuring the magnitude, size or extent of a phenomenon. Data are collected according to formal rules of procedure and verification, analysis uses standardised statistical formats, and prediction and empirical generalisation are the desired outcomes.15,16 The conventional methodological criteria of quantitative research — validity, reliability and empirical generalisability — are generally not directly applied to qualitative research because of the different frameworks, sampling approaches, sample sizes and goals of qualitative research. Instead, terms such as rigour (thoroughness and appropriateness of the use of research methods), credibility (meaningful, well presented findings) and relevance (utility of findings) are used to judge the quality or “trustworthiness” of a study (see Box 1 for definitions of some common terms in qualitative research).17

MJA guidelines for assessing qualitative research

There has been a strong move towards standardising research reporting in quantitative research, as can be seen in the development of standard reporting mechanisms like the CONSORT (Consolidated Standards of Reporting Trials) statement.18 However, some question the appropriateness of using checklists to assess qualitative research because of the diversity of approaches to collecting, analysing and interpreting qualitative data.4,19 Furthermore, the sheer number of checklists can prove overwhelming and confusing for new researchers. A recent review of criteria for assessing interview and focus group studies identified more than 22 checklists in active use, and subsequently resulted in the formulation of a new 32-item guideline.20

Nevertheless, it is possible to develop clear and useful generic guidelines for assessing and presenting qualitative research.2,7,17,21 Box 2 outlines some criteria that have been constructed with the MJA readership in mind that can be used to assess and enhance the rigour of qualitative studies. In themselves, these criteria do not ensure rigour. However, they can strengthen rigour if they are used in concordance with a broader understanding of qualitative research design, data collection and analysis.19

Procedural rigour

Procedural, or methodological, rigour concerns the transparency or “explicitness” of the description of the way the research was conducted. It involves detailing issues of accessing subjects; development of rapport and trust; how data are collected, recorded, coded and analysed; and accounts of the manner in which errors or subject refusals are dealt with.4,11,22 In this regard, readers and reviewers may ask the following questions while examining descriptions of qualitative methods: How were participants/settings accessed? Who was interviewed/observed? How often? For how long? What interview questions were asked? What was the purpose of any observation? Which policy documents/case notes were accessed? How were they assessed? How were the collected data managed?

Interpretative rigour

Interpretative rigour relates to demonstrating the data/evidence as fully as possible. A commonly used concept in qualitative research is inter-rater reliability, which refers to a type of researcher triangulation in which multiple researchers are involved in the analytical process. This is an attempt to increase the validity and reliability of the study19 through the provision of a more complex and nuanced understanding of the possible interpretations of the objects of the research.11 In contrast to the quantitative research paradigm, what is important in this process is not the level of consensus, but the opportunity for discussion among analysts, which in turn provides opportunities for developing further coding.19

A related technique is that of respondent validation, or member checking. This entails offering the subjects interviewed the opportunity to view and amend their transcripts as a form of validity check.12 However, this approach does have limitations: the positions and purposes of the researchers and participants evolve over time, potentially affecting interpretations and accounts. Respondent validation should be thought of as part of a process of reducing error, one that generates further original data, which then requires interpretation.8

Other techniques that enhance interpretative rigour are the differing forms of triangulation: data (multiple evidentiary sources; ie, documents, interviews, survey data, observation), methods (multiple methods), and theory (multiple theoretical and conceptual frames applied to the research to enhance insights into phenomena). Using these forms of triangulation allows the development of a comprehensive understanding of the phenomena and can ameliorate the potential bias of simply using one method.4,5,8,11,16,22

In the interpretive process, accounts of “negative” or “deviant” cases are especially important. These are explanations pertaining to data or evidence that contradict the researchers’ overall explanatory account of the phenomena.5

In sum, a clear description is needed of the forms of analysis used, the analytical process itself, and the major outcomes of that process in terms of findings. This ensures quality on the part of the author, and enables the reader to assess the analytical quality of the research.

Reflexivity and evaluative rigour

Reflexivity is the process by which researchers openly acknowledge and address the influence that the relationships among the researchers, the research topic and the subjects may have on the results.4,11,13 Fundamentally, reflexivity requires a demonstration by the researchers that they are aware of the sociocultural position they inhabit and of how their value systems might affect the selection of the research problem, the research design, and the collection and analysis of data.15 It also refers to an awareness by the researchers of the social setting of the research and of the wider social context in which it is placed.4

Evaluative rigour refers to ensuring that the ethical and political aspects of research are addressed. Typically, this means proper ethics approval from appropriate committees covering confidentiality, informed consent and steps to avoid possible adverse effects on the subjects. Importantly, where appropriate, relevant community leaders should be consulted in the design and conduct of the research.11 Researchers should revisit their actions and interactions within the research process to ensure as “accurate” a portrayal as possible of the production of their findings.


Conceptual generalisability and transferability

Conceptual generalisability and transferability refer to how well the study’s findings inform health care contexts that differ from the one in which the original study was undertaken.4 For example, a review of data from qualitative studies examined a wide variety of doctor–patient interactions about medication compliance.24 The authors examined barriers to patients taking prescribed medication as directed by their doctors and found that patients were often inclined to resist taking medicines, not because of problems with the patients, doctors or systems, but because patients were concerned about the medicines. This type of study allows for the construction and transfer of general policy on medicine-taking (through, for example, less emphasis on patient behaviour modification and more emphasis on production of safer medicines) and practice (suggesting, for instance, that doctors should assist lay evaluations through provision of more information, support, feedback and safe prescribing practices).

The parameters for presenting qualitative research in the MJA

The formats of biomedical journals such as the Medical Journal of Australia and BMJ provide particular challenges for comprehensive analysis and reporting of qualitative studies.4,11,22 The house style of biomedical journals involves tight restrictions on word length and particular forms of presentation. For instance, lengthy ethnographic studies of health issues can require 6000–8000 words to successfully elucidate the phenomena.25 Obviously, this is not tenable within most biomedical journals, where article lengths of 2000–3000 words are more common. Such restrictions significantly limit exhaustive analysis of the volume of data commonly produced within qualitative studies.5

When writing for the MJA, we suggest that qualitative authors focus on only a couple of aspects of their research findings and use visual displays strategically (eg, charts, quotes and tables).22 We suggest that authors avoid an overabundance of simplified descriptions of multiple aspects of the phenomena under study, as this detracts from a focused, transparent and considered analysis of the core issues relevant to the objectives of the research.


Conclusion

We have set out some useful general rules that, if followed, will allow for concise and informative assessment of qualitative findings. These assessment criteria will be particularly relevant to articles that use the most common qualitative data collection techniques: interviews, focus groups, document analysis, or observation techniques (alone or in combination). We suggest that potential MJA authors also consider using the article format and distribution of word count presented in Box 3 to maximise the rigour and clarity of their research articles. Using this outline should allow authors to provide reviewers and readers with enough information on each aspect of the research to engender a sense of research rigour and of the trustworthiness of the research findings.

  • Simon C Kitto1
  • Janice Chesters1
  • Carol Grbich2

  • 1 School of Rural Health, Monash University, Moe, VIC.
  • 2 Flinders University, Adelaide, SA.

Competing interests:

None identified.

  • 1. Green J, Britten N. Qualitative research and evidence based medicine. BMJ 1998; 316: 1230-1232.
  • 2. Boulton M, Fitzpatrick R, Swinburn C. Qualitative research in health care: II. A structured review and evaluation of studies. J Eval Clin Pract 1996; 2: 171-179.
  • 3. Daly J, Willis K, Small R, et al. A hierarchy of evidence for assessing qualitative health research. J Clin Epidemiol 2007; 60: 43-49.
  • 4. Green J, Thorogood N. Qualitative methods for health research. London: Sage, 2004.
  • 5. Mays N, Pope C. Qualitative research: rigour and qualitative research. BMJ 1995; 311: 109-112.
  • 6. Mays N, Pope C. Qualitative research: observational methods in health care settings. BMJ 1995; 311: 182-184.
  • 7. Mays N, Pope C. Quality in qualitative health research. In: Pope C, Mays N, editors. Qualitative research in health care. London: BMJ Books, 1999: 89-101.
  • 8. Mays N, Pope C. Qualitative research in health care: assessing quality in qualitative research. BMJ 2000; 320: 50-52.
  • 9. Pope C, Mays N. Qualitative research: reaching the parts other methods cannot reach: an introduction to qualitative methods in health and health services research. BMJ 1995; 311: 42-45.
  • 10. Pope C, Ziebland S, Mays N. Qualitative research in health care: analysing qualitative data. BMJ 2000; 320: 114-116.
  • 11. Liamputtong P, Ezzy D. Qualitative research methods. Melbourne: Oxford University Press, 2005.
  • 12. Lincoln YS, Guba EG. Naturalistic inquiry. Newbury Park, Calif: Sage, 1985.
  • 13. Malterud K. Qualitative research: standards, challenges, and guidelines. Lancet 2001; 358: 483-488.
  • 14. Miles MB, Huberman AM. Qualitative data analysis: an expanded sourcebook. 2nd ed. Thousand Oaks, Calif: Sage, 1994.
  • 15. Grbich C. Qualitative research in health: an introduction. London: Sage, 1999.
  • 16. Grbich C. Qualitative data analysis: an introduction. London: Sage, 2007.
  • 17. Critical Appraisal Skills Programme (CASP). 10 questions to help you make sense of qualitative research. Oxford: Public Health Resource Unit, 2006. Tool.pdf (accessed Oct 2007).
  • 18. Moher D, Schulz K, Altman D. The CONSORT statement: revised recommendations for improving the quality of reports of parallel-group randomized trials. JAMA 2001; 285: 1987-1991.
  • 19. Barbour RS. Checklists for improving rigour in qualitative research: a case of the tail wagging the dog? BMJ 2001; 322: 1115-1117.
  • 20. Tong A, Sainsbury P, Craig JC. Critical appraisal criteria for interviews and focus groups. In: Qualitative Research in Evidence-based Healthcare — an Exploration of Scope and Methods. 2006; Jul 10–11; Adelaide.
  • 21. Blaxter M. Criteria for qualitative research. Med Sociol News 2000; 26: 34-37.
  • 22. Hansen EC. Successful qualitative health research: a practical introduction. Sydney: Allen and Unwin, 2006.
  • 23. Rice PL, Ezzy D. Qualitative research methods: a health focus. Melbourne: Oxford University Press, 1999.
  • 24. Pound P, Britten N, Morgan M, et al. Resisting medicines: a synthesis of qualitative studies of medicine taking. Soc Sci Med 2005; 61: 133-155.
  • 25. Moore D. Governing street-based injecting drug users: a critique of heroin overdose prevention in Australia. Soc Sci Med 2004; 59: 1547-1557.

