Is there value in the Relative Value Study? Caution before Australian Medicare reform

Michael Wright
Med J Aust 2015; 203 (8): 331-333. || doi: 10.5694/mja15.00571
Published online: 19 October 2015


  • The federal government has announced formation of a Medicare Benefits Schedule (MBS) Review Taskforce, described as the most comprehensive review of the MBS in its near 50-year history.
  • The Relative Value Study (RVS) was the last major collaborative attempt between the government and the medical profession to restructure the MBS.
  • The RVS was a review of services and fees of the MBS conducted between 1994 and 2001 that was never implemented.
  • This article provides a historical narrative of the RVS and discusses the failure of its implementation in terms of health policy theory.
  • Understanding the specific difficulties of the RVS methodology, and the broader challenges of health care reform, may allow for more successful implementation of the current Medicare review.

In April 2015, the federal Minister for Health, Sussan Ley, announced several initiatives to improve the operation of Medicare. One of these is the formation of a Medicare Benefits Schedule (MBS) Review Taskforce, to align MBS services with “contemporary best clinical practice”.1 The taskforce was allocated $34 million in the 2015 federal Budget, and has been described as the “most comprehensive review of the MBS ever undertaken”.2

In light of this review (and ongoing federal government determination for health care savings), it is timely to consider the lessons from the failure of the last major collaboration between government and the medical profession to restructure health system funding. The Relative Value Study (RVS) was a 7-year collaboration between the Australian Medical Association (AMA) and the federal health departments in the 1990s. The RVS was a review of the services and fees of the existing MBS, in order to address perceived unfairness. The RVS cost the taxpayer over $7.8 million but was never implemented.

Funding of the Medicare Benefits Schedule

When the Medical (now Medicare) Benefits Schedule was introduced in July 1970, medical fees were set according to an indicative list of AMA-approved “most common fees” charged. Technological advances subsequently reduced the time required to perform many procedures, but the calculation of payment did not change. By the 1980s, there was growing pressure from general practitioners and public health professionals to reform the MBS to better reward consultation-based medicine and encourage health promotion. There was also discontent among doctors that Medicare rebates were not keeping pace with increasing practice costs or inflation.3 This discontent was heightened by a freeze on indexation of rebates in 1999.4

Further, there was concern (from funders and researchers) that medical costs were increasing and that better value for money was needed for health services.5 The “common fees” origin of the MBS was seen as potentially resulting in doctors being paid what the market would bear or what patients were willing to pay, rather than for the real value of health services.6,7

In response to similar concerns, researchers in the United States developed a method for estimating doctors’ work based on valuing the resources used in the production of their medical services.8 This process allowed production of a relative value scale to rank services according to costs, and formed the basis of Medicare payment reform in the US in 1992.9

By the 1990s, there was sufficient dissatisfaction with the MBS that a similar process was considered for Medicare in Australia by the AMA and the then Minister for Health, Carmen Lawrence.

The Relative Value Study in Australia

The AMA and the Commonwealth Department of Human Services and Health agreed to jointly review the relativities and funding of the MBS. The Medicare Schedule Review Board (MSRB) oversaw this process and comprised three representatives from each organisation. In 1994, a joint discussion paper stated that the “overall objective of the review is to set relativities of fees for private services covered by the MBS on a consistent, fair and workable basis”.6 Further objectives were to encourage a fee level so patients were not unduly out of pocket, and to provide a reliable basis for updating fees.

The RVS was a three-stage process (Box).

Stage 1

Stage 1 comprised a review of consultation item numbers (viewed as needing the most urgent change) and proposed the introduction of time-tiered consultations for specialists, and an expansion of GP time tiers, resulting in an eight-tier system of consultation item numbers.5 Stage 1 also developed a costing formula for medical fees:

Medical fee = [Professional component] + [Practice cost component]

F = [Relative value of service × Doctor earning rate] + [Direct costs + Overheads + Professional indemnity costs + Working capital allowance]

The formula defined medical fees in terms of professional and practice costs. Professional cost was seen as the product of the relative value of a service (itself a combination of time, intensity and risk) and the relative value of the doctor, while practice costs were the sum of direct costs (such as staffing and consumables), general overheads for a reasonably efficient practice, professional indemnity costs and an allowance for working capital.
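The costing formula above can be sketched in code. The following is a minimal illustration only; every input figure is a hypothetical placeholder, not a value from the RVS:

```python
# Sketch of the Stage 1 costing formula. All figures below are
# hypothetical placeholders, not values from the Relative Value Study.

def medical_fee(relative_value, earning_rate,
                direct_costs, overheads, indemnity, working_capital):
    """F = [relative value of service x doctor earning rate] + [practice costs]."""
    professional_component = relative_value * earning_rate
    practice_cost_component = direct_costs + overheads + indemnity + working_capital
    return professional_component + practice_cost_component

# e.g. a consultation valued at 1.5 RVUs at a notional $20 per RVU,
# with $15 of practice costs attributed to the service
fee = medical_fee(1.5, 20.0, direct_costs=8.0, overheads=4.0,
                  indemnity=2.0, working_capital=1.0)
print(f"${fee:.2f}")  # -> $45.00
```

The structure makes clear why each unresolved assumption mattered: a disagreement in any one input (the relative value, the earning rate, or a practice cost line) flows directly into the final fee.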

Stage 2

Stage 2 comprised three large technical studies completed in 2001, providing information for the costing formula.

The Practice Cost Study10 used modelling techniques to estimate practice costs incurred by reasonably efficient private medical providers, to populate the practice cost component of the costing formula.

The Professional Relativities Study11 developed two non-interchangeable scales of relativity: one valuing procedures and one valuing consultation services. The Australian Work Relative Value Unit (RVU) scale valued MBS procedures by assessing the time to complete (including pre- and post-service time) and work intensity (the combination of cognitive skill, technical skill and effort, and stress due to risk to the patient or difficulty of the procedure), and ranked all item numbers in a specialty in order of their total work value. The Attendance Item Work Relativities scale calculated relativities of consultations based on the eight-tier item number restructure from the Stage 1 report. These relativities considered the length, location and intensity of the consultation and the amount of face-to-face time, and differentiated between new and regular patients.

The Remuneration Rates Study12 calculated base GP remuneration rates (from a sample of 60 GPs) and developed career comparisons, first between GPs and specialists and then between GPs and other “similar” professions, including lawyers, accountants and geologists.

Reception of the Stage 2 reports

There was criticism of some specifics of the technical reports. The Remuneration Rates Study reported a wide range of values for GP work value points and suggested GP career remuneration should be less than that for an accountant or lawyer but more than that for a geologist. The choice of “other professionals” was criticised as not reflecting the unpredictability or difficulty of GP work. The Practice Cost Study was criticised for including different office and support staff costs for GPs and specialists, for not considering locum costs, and for large differences in professional development costs (annual GP allocation of $1256, versus $4394 for specialists).13

Importantly, after the completion of all the Stage 2 reports, there remained a number of assumptions that the technical researchers (and subsequently members of the MSRB) could not agree on. The proportion of appropriate non-face-to-face time for GPs and specialists could not be agreed, and the Professional Relativities Study produced two models of attendance item scales. The size of a reasonably efficient general practice was also not agreed, and therefore multiple costings for different-sized practices were provided in the Practice Cost Study. There was also failure to agree on appropriate continuing medical education costs and on the appropriate payment for an efficient GP — whether this should be the average of other professionals, or more (or less) than for the comparator professions.

Failure to agree on these assumptions meant that the MSRB was not able to agree on the value of the 15-minute GP attendance — the item number against which all other item numbers would be compared. The Department’s preferred value was 1.15 RVUs, whereas that of the GP members of the MSRB was 1.77 RVUs.

Modelling processes

Before the release of the Stage 2 reports, the AMA released modelling which suggested that a 15-minute GP consultation should be valued at $44 (based on 1.77 RVUs), and that full implementation of the RVS would cost an additional $1.5 billion. The Department’s modelling suggested a 15-minute GP consultation fee of $29.30, and predicted increased consultation costs of $154 million and a reduction in procedural costs of $96 million, for a net implementation cost of $58.3 million.
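As a back-of-envelope illustration (my own inference, not an analysis from the study), dividing each side's proposed fee by its proposed RVU value gives the dollar rate per RVU implied by each model:

```python
# Implied dollars-per-RVU under each side's preferred figures, as reported
# in the text. The division itself is illustrative inference only.
ama_fee, ama_rvus = 44.00, 1.77      # AMA: 15-minute GP consultation
dept_fee, dept_rvus = 29.30, 1.15    # Department's preferred model

print(f"AMA implied rate:        ${ama_fee / ama_rvus:.2f} per RVU")
print(f"Department implied rate: ${dept_fee / dept_rvus:.2f} per RVU")
```

On these figures the implied conversion rates are close (roughly $25 per RVU in each model); most of the gap between $44 and $29.30 lay in the relative value assigned to the consultation itself.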

The difference between the two modelling figures was not resolved. As the MSRB was developing its recommendations on the review of fees in March 2001 (after over 70 meetings), the Board was disbanded by the then Minister for Health and Aged Care, Dr Michael Wooldridge.

Stage 3

Stage 3 aimed to link the data from the three technical studies to produce a new schedule of fees. The Department conducted an independent Stage 3 report based on its preferred assumptions, which was tabled to Senate Estimates in May 2001.14

The AMA modelling figures were used to advocate for increased health funding leading into the 2001 election. With the re-election of the Howard Government in November 2001 and the retirement of Dr Wooldridge, the RVS was not further funded in the 2002–03 Budget and implementation did not proceed.


One health policy theory suggests that for an issue to get onto the health policy agenda, consideration needs to be given to three streams: problems, policies and politics.15 These three streams usually flow independently but will occasionally come together to create a window of opportunity for health reform.

During the establishment of the RVS, the three streams were seen to converge. There was acknowledgement of a problem: fundamental unfairness of the MBS leading to increased provider and researcher dissatisfaction. From a policy perspective, there was agreement that a review of the system was needed and researchers had shown that an alternative system could be designed. There was also a political will for change, as evidenced by the considerable funding budgeted for the RVS.

Why did the RVS fail and what did we learn?

The fatal flaw in the RVS process was the initial failure to resolve the different objectives of the members of the MSRB. As the RVS progressed, it became apparent that the policy solutions would either decrease income to some procedural medical practitioners (not satisfactory to the AMA) or significantly increase Medicare costs (not satisfactory to the government).

Further, the failure of the AMA and the Department to agree on a number of technical assumptions made modelling difficult and implementation impossible. These disagreements prevented valuation of the base consultation and stopped linkage of the two scales from the Professional Relativities Study. Additionally, failure to agree on the career value of GPs prevented determination of the doctor earning rate for other specialties.

From a technical standpoint, the RVS methodology has been criticised for failing to measure the quality of outputs (such as effect on health outcomes) or the societal benefit of a procedure.6 A resource-based relative value study suggests that two procedures have the same value if they use the same resources. But is providing a hearing aid to an 80-year-old as valuable as removing a melanoma in a 30-year-old?

Also, the RVS formula described values in dollar terms rather than in terms of pure relativities. Had all RVU activity been added together and divided into the total budget, a dollar value could have been determined for each RVU. This would at least have allowed implementation of the relativities and recalibration of MBS values, regardless of disagreement about the overall level of remuneration.
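The recalibration described above amounts to computing a budget-neutral conversion factor. A minimal sketch, using an entirely hypothetical budget, item names and volumes:

```python
# Budget-neutral recalibration sketch: derive a dollar value per RVU by
# dividing a fixed total budget by total RVU activity. All figures and
# item names are invented for illustration.

total_budget = 6_000_000_000           # notional annual MBS outlay ($)

# item -> (RVUs per service, annual service volume)
services = {
    "gp_consultation": (1.15, 100_000_000),
    "procedure_x":     (4.00, 10_000_000),
}

total_rvus = sum(rvus * volume for rvus, volume in services.values())
dollars_per_rvu = total_budget / total_rvus   # the conversion factor

for item, (rvus, _) in services.items():
    print(f"{item}: ${rvus * dollars_per_rvu:.2f} per service")
```

Because total outlays are fixed by construction, items gain or lose relative to one another without changing the overall budget, which is precisely why the approach sidesteps the remuneration-level dispute.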

Finally, although the broad objectives of the RVS may have encouraged initial collaboration, they may have contributed to overall failure of implementation. A similar issue lay at the heart of the Medicare Locals program (established with broad objectives to improve the health of the community, but criticised in an external review for lacking clear purpose16) and is also a consideration for the emerging Primary Health Networks.

Where have we come since the failure of the RVS?

Departmental modelling confirmed the view that the MBS rewarded procedural medicine better than consultative medicine.16 In response, incentive programs were designed to reward consultative medicine (particularly general practice), such as the Enhanced Primary Care (1999) and Strengthening Medicare (2004) programs.17,18

These incentive programs increased funding for general practice and provided a smaller-scale “no loser” policy solution — more acceptable to the AMA and more easily implemented by the government.

Some of the relativities developed in the RVS have been used by the Department when reviewing remuneration of particular specialties to ensure cost-neutral adjustments. In 2005, the then Minister for Health, Tony Abbott, suggested, “I certainly think it would be good if we could look again at the relative value study”.19 This did not occur, and major MBS reform stalled until Minister Ley’s announcement of the current review process.


The collaboration of the RVS failed because of a combination of political, technical and policy disagreements that could not be resolved.

Value remains in the RVS, at a minimum as a lesson about the difficulties of health care reform. The new MBS Review Taskforce can learn from this, and the previous technical work can be built on to develop a sustainable and dynamic MBS. As the MBS approaches 50 years, such a review is long overdue.

Box – Stages of the Relative Value Study in Australia

Report: Towards a Relative Value Study (Department and AMA)

Stage 1 — Relative value costing formula and eight-tier attendance item number restructure (Department and AMA)

Stage 2: technical reports —
  • Practice Cost Study: costs of running an efficient medical practice
  • Professional Relativities Study: relative value scale for procedures; and relative value scale for consultations (National Centre for Classification in Health)
  • Remuneration Rates Study: determination of fair remuneration for GPs; comparison between GP and specialist pay; and comparison between GPs and other professionals (Healthcare Management Advisors)

Stage 3: modelling report — Not completed. The Medicare Schedule Review Board was dissolved in March 2001; however, the Department conducted its own modelling report in May 2001 (Department, with some separate modelling by the AMA)

Department = federal Department of Health. AMA = Australian Medical Association.

Provenance: Not commissioned; externally peer reviewed.

  • Michael Wright

  • 1 University of Technology Sydney, Sydney, NSW
  • 2 Sans Souci Medical Practice, Sydney, NSW


This research was conducted at the Centre for Research Excellence in the Finance and Economics of Primary Care, which is supported by a grant from the Australian Government Department of Health.

Competing interests:

No relevant disclosures.

