The impact of the performance-based research fund on the research productivity of New Zealand universities.

Author: Smart, Warren


The introduction of the Performance-Based Research Fund (PBRF) has resulted in much greater scrutiny of the research activities of New Zealand universities. This study examines the impact of this greater scrutiny on the research productivity of the universities. The analysis shows that most universities exhibit a significant increase in productivity in the period following the introduction of the PBRF. This finding is corroborated through the use of a production function approach to model the research process in New Zealand universities. This showed that the number of research publications listed in the Web of Science (1) by university researchers is significantly higher following the introduction of the PBRF. Analysis of total research output data shows that this increase in Web of Science research publications has not been at the expense of other forms of research output.


Governments around the world are increasingly using performance-based funding to allocate resources for research at higher education institutions. New Zealand is no exception to this trend, with the introduction of the Performance-Based Research Fund (PBRF) in 2004 representing arguably the most significant change to the funding of tertiary institutions in New Zealand since the introduction of the equivalent full-time student (EFTS) funding system in 1991. It marks the first time that a substantial proportion of tertiary education funding from Vote Education (2) has been allocated based on institutional performance.

Although the main objective of the PBRF is to raise the average quality of research through rewarding excellence (Tertiary Education Commission 2004), the increased scrutiny the PBRF places on research performance is likely to have increased the quantity of research output at New Zealand universities. This study attempts to quantify this effect by analysing the impact of the PBRF in terms of stimulating research output, in the form of journal articles and reviews (3) listed in the Web of Science, at the eight New Zealand universities. (4)

To measure the impact of the PBRF on research productivity, this analysis uses a mix of quantitative approaches. Firstly, the number of articles and reviews listed in the Web of Science per full-time equivalent research staff is examined over a 10-year period to see if productivity increased in the period following the introduction of the PBRF. Then a production function approach is used to model the research process at the New Zealand universities. Within this framework, multiple regression analysis is applied to panel data for the eight New Zealand universities. An advantage of using regression analysis in this case is that it can control for other factors that influence research output, thereby helping to isolate the impact of the PBRF.
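The panel regression described above can be illustrated with a minimal sketch. This is not the author's actual model or data: the figures below are synthetic, and the specification (university fixed effects plus a post-PBRF indicator on log output, estimated by ordinary least squares) is an assumed simplification of the production function approach.

```python
import numpy as np

rng = np.random.default_rng(0)

n_univ, n_years = 8, 10   # eight universities, a 10-year panel
pbrf_start = 5            # hypothetical: PBRF in effect from year 5 onward

# Synthetic log research output: university fixed effect + PBRF shift + noise
fe = rng.normal(4.0, 0.5, n_univ)   # hypothetical university-specific effects
true_beta = 0.15                    # hypothetical PBRF effect on log output
univ = np.repeat(np.arange(n_univ), n_years)
year = np.tile(np.arange(n_years), n_univ)
pbrf = (year >= pbrf_start).astype(float)
y = fe[univ] + true_beta * pbrf + rng.normal(0, 0.05, n_univ * n_years)

# Design matrix: one dummy per university (fixed effects) + the PBRF indicator
X = np.zeros((n_univ * n_years, n_univ + 1))
X[np.arange(n_univ * n_years), univ] = 1.0
X[:, -1] = pbrf

# OLS estimate; the last coefficient is the estimated PBRF effect
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"estimated PBRF effect on log output: {beta_hat[-1]:.3f}")
```

Because the PBRF indicator varies within each university over time, its coefficient is identified separately from the fixed effects; the dummies absorb time-invariant differences between institutions, which is the sense in which the regression "controls for other factors".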

This article begins by outlining the history of performance-based funding of research in the tertiary education sector in New Zealand and briefly describes how universities reported their response to the PBRF. The reasons for using the Web of Science to measure research output are discussed and the limitations that apply to the coverage of total research activity in this data set are explained. The productivity of university research staff is then examined, and the empirical model used to estimate the research production function is introduced. This is followed by a discussion of the results of the production function analysis, with a focus on the impact of the PBRF. Finally, some conclusions and future areas of analysis are presented.


The New Zealand government has expressed a desire to introduce performance-based funding of tertiary education research for more than a decade. In 1997 the publication of the Green Paper (Ministry of Education 1997) signalled the government's intention to change the system of funding research from one based on the number of student enrolments to one where a contestable fund would be used to distribute funding based on the quality of research at an institution. However, a change of government in late 1999, along with concerns expressed by the universities at the size of the contestable fund and lack of operational detail, resulted in the proposed approach being deferred (Boston 2006).

The proposal for performance-based funding for research was revived by the Tertiary Education Advisory Commission (2001) report Shaping the Funding Framework. The report recommended the introduction of a "Performance-Based Research Fund", based on a mixed model of peer review and performance indicators, (5) to assign funding based on the quality of research at an institution. The Cabinet signed off on the decision to go ahead with the PBRF in May 2002, with the operational detail of the PBRF being outlined in late 2002. (6) The PBRF began allocating funding in 2004 as funding from enrolments-based research top-ups was phased out. In 2007, the year in which the transition from the research top-ups to the PBRF was completed, the PBRF is estimated to have allocated around $231 million in funding to participating tertiary institutions (Tertiary Education Commission 2007).

Sixty percent of the funding allocated via the PBRF is based on the results of the Quality Evaluation, which uses peer review to evaluate the quality of research by PBRF-eligible staff at participating tertiary institutions. Peer reviewers evaluate researcher performance across three dimensions: the quality of research output, the esteem in which the researcher is held by their peers, and their contribution to the research environment. In generating a final quality category, the quality of research output has the highest weighting.

The top quality category A is assigned to a researcher who produces research that is assessed as being of international quality. This is followed in order by B, C and R quality categories. Funding is only allocated to those researchers who receive a minimum of a C quality category, with A researchers receiving the highest weighting of funding.

Importantly, the results of the Quality Evaluation are published by the Tertiary Education Commission. Therefore, there is a strong incentive for universities to improve the quality of their research output, not only to maximise the funding received via the PBRF but also to maximise the positive impact from a high ranking.

An examination of university profiles published shortly after the release of the first PBRF Quality Evaluation results in 2004 illustrates how the PBRF has influenced the management of the research process at the universities. A number of universities stated that the results of the first PBRF Quality Evaluation would be used to identify areas of research strength and also those areas that required additional support to improve the level of quality of research (University of Auckland 2004, Massey University 2004, University of Canterbury 2004, Victoria University of Wellington 2004). Having identified those areas requiring additional support, the universities indicated that improvements would be made through helping research-inactive staff to improve their performance (Massey University 2004) and/or by recruiting research-active staff (Auckland University of Technology 2004, University of Auckland 2004).

The examination of the university profiles also shows that a number of universities used PBRF measures explicitly in setting goals. For example, the University of Waikato stated that it wanted to build on its 2003 Quality Evaluation results (University of Waikato 2004), while the University of Canterbury stated an explicit long-term goal of being New Zealand's top university for quality of research, as measured by the PBRF (University of Canterbury 2004).

Given the response of the universities to the PBRF, it seems likely there would be an associated increase in the volume of research activity, especially by those researchers seeking to improve their quality category. International experience also suggests that the introduction of performance-based funding of research increases research activity. Liefner (2003) interviewed professors from a number of prestigious universities around the world on aspects of the funding of research. There was broad agreement that the introduction of performance-based funding leads to an increase in research activity, and hence in both the quantity and quality of output, as a result of the increased scrutiny of performance. An evaluation of the impact of the United Kingdom's (UK) Research Assessment Exercise (RAE), which, like the PBRF, involves peer assessment of research quality, observed that researchers targeted journal publication as a means of raising their level of research quality (McNay 1998). This was based on a perception that publishing in highly cited journals would be viewed favourably by the review panels (Elkin 2001).

If New Zealand researchers responded in a similar fashion, this would be reflected in an increase in the number of journal articles and reviews published in the years following the decision to introduce the PBRF, and also in a shift towards journals likely to be more highly cited.


A key problem in measuring the research output of universities in New Zealand is a lack of consistency in the way the universities report their research output...
