Performance-based research fund--implications for research in the social sciences and social policy.

Author: Boston, Jonathan

Abstract

This paper discusses the outcome and likely implications of the new Performance-Based Research Fund (PBRF) in New Zealand, with particular reference to the results for, and possible impact upon, the social sciences. The paper begins with a brief outline of the rationale for, and design of, the PBRF and then examines the outcome of the first Quality Evaluation of research in the tertiary education sector, conducted in 2003. The paper includes consideration of the relative performance of the eight main social science subject areas, together with an analysis of how these subject areas compare with the other 33 disciplinary groupings (drawing upon a number of data sources in addition to the PBRF results). Attention is also given to what the 2003 Quality Evaluation reveals about the demographic structure of New Zealand's academic community, with particular reference to the social sciences. The paper concludes by exploring the possible implications of the PBRF for the funding and conduct of research in the social sciences. The analysis indicates that although the research performance of the social sciences is generally superior to the average across the 41 subject areas assessed, the introduction of the PBRF is likely to result in the social sciences suffering a small net loss of funding. Nevertheless, the best performing of the eight social science subject areas, such as psychology, stand to gain in relative terms, as do the two universities with the highest quality scores--Auckland and Otago.

INTRODUCTION

This paper discusses the design, results and likely implications of the new Performance-Based Research Fund (PBRF) in New Zealand, with particular reference to the results for, and possible impact upon, the social sciences. The paper begins with a brief outline of the rationale for the PBRF and a description of the three components of the fund--the Quality Evaluation, the Research Degree Completions (RDC) measure and the External Research Income (ERI) measure. Consideration is then given to the outcome of the first Quality Evaluation of research in the tertiary education sector, which was conducted in 2003.

Having examined the overall results for the tertiary sector as a whole, the paper analyses the performance of the social sciences. This includes comparisons of the relative performance of the various social science disciplines, together with an analysis of how the social sciences compare with other disciplinary groupings (drawing upon a number of data sources in addition to the PBRF results). Attention is also given to what the 2003 Quality Evaluation reveals about the demographic structure of New Zealand's academic community, with particular reference to the social sciences. The paper concludes by exploring the possible implications of the PBRF for the funding and conduct of research in the social sciences. Significant attention to the RDC and ERI measures within the PBRF is outside the scope of this paper.

THE DESIGN AND IMPLEMENTATION OF THE PBRF

Internationally, there has been growing recognition of the importance to a nation's economic and social advancement of a high performing research sector. That recognition has led governments in many countries to reform the systems for managing and funding the research activities of higher education institutions. (1) The public funding for research in the tertiary education sector in New Zealand has for many decades been largely delivered as a component of student tuition subsidies; it has thus been significantly dependent upon the structure and volume of student demand. During the 1990s, various critics argued that funding research on such a basis did not encourage research excellence; nor did it ensure that high-calibre researchers received adequate resources. (2) In 1998, the government proposed the establishment of a separate contestable research pool for the tertiary sector, but the idea received a mixed reception and was not implemented.

Subsequently, in late 2001, the Tertiary Education Advisory Commission (TEAC), created by a new government to advise it on tertiary education matters, proposed the establishment of a Performance-Based Research Fund (PBRF). Under the proposed model, the research "top-up" component of tuition subsidies would be placed in a separate fund and allocated to eligible tertiary education organisations (TEOs) via a new performance-based funding formula. In accordance with this formula:

* 50% of the available funding would be allocated on the basis of the results of periodic assessments by expert panels of the quality of the research produced by eligible staff in participating TEOs

* 25% would be based on the volume of RDC (with cost weightings and research component weightings for different types of research degrees)

* 25% would be based on the volume of ERI.

The suggested combination of peer review and performance indicators led to the PBRF being referred to as a "mixed model". In this context, the proposed approach differed from the "pure" indicator models used in Australia and Israel and the "pure" peer review models employed in Britain and Hong Kong. Equally important, unlike the British Research Assessment Exercise (RAE) where the unit of assessment is a discipline-based department or school, TEAC proposed that the unit of assessment be individual staff members (as in Hong Kong) (see Boston 2002).

The government endorsed the broad concept of a PBRF in mid-2002. But in December 2002, on the advice of a sector working group, it made a number of changes to the scheme envisaged by TEAC. For instance, the weighting placed upon the peer assessment component was increased to 60%, while the weighting of the ERI component was reduced to 15%. Also, in keeping with the working group's recommendations, a much more comprehensive approach to the assessment of research quality was approved. This entailed an evaluation of Evidence Portfolios (EPs) prepared by each eligible staff member, with each EP providing details of the author's research outputs, peer esteem and contribution to the research environment during the preceding six years. It was agreed, in accordance with the working group's advice, that the first so-called "Quality Evaluation" of research in the tertiary education sector would be conducted in 2003, with a second planned for 2006. Beyond this, Quality Evaluations would be held every six years. Under the new policy, funding via the PBRF would be phased in during 2004-2007, reaching a total of about $175 million in 2007. The newly established Tertiary Education Commission (TEC) was given the task of implementing the PBRF.
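The revised weightings can be illustrated with a short arithmetic sketch. This is a hypothetical illustration of how the 60/25/15 split described above would divide a funding pool across the three components, using the approximate 2007 total of $175 million mentioned in the text; it is not the TEC's actual allocation formula, which applies further cost weightings within each component.

```python
# Illustrative split of a PBRF funding pool across its three components,
# using the final weightings: Quality Evaluation 60%, Research Degree
# Completions (RDC) 25%, External Research Income (ERI) 15%.
# The function name and structure are hypothetical, for illustration only.

WEIGHTS = {
    "quality_evaluation": 0.60,
    "rdc": 0.25,
    "eri": 0.15,
}

def split_pool(total_funding):
    """Divide a total funding pool according to the PBRF component weights."""
    return {component: total_funding * share
            for component, share in WEIGHTS.items()}

# Approximate 2007 pool of $175 million:
allocation = split_pool(175_000_000)
print(allocation)
# {'quality_evaluation': 105000000.0, 'rdc': 43750000.0, 'eri': 26250000.0}
```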

Under the assessment regime developed for the first Quality Evaluation, participating TEOs conducted an initial internal evaluation of the EPs of their respective PBRF-eligible staff members - this included most academic staff and a limited number of non-academic staff (e.g. post-doctoral fellows) (see Hall et al. 2003). Each EP was assigned one of four possible Quality Categories ("A", "B", "C" and "R"). Those nominated an "A", "B" or "C" were submitted, in late September 2003, to TEC for assessment by one of the 12 peer review panels. These panels, in turn, made their own judgement concerning the quality of each submitted EP and assigned an appropriate Quality Category. Such judgements were based on clear standards and guidelines established by TEC. The decisions of each panel were subject to the oversight of a Moderation Panel consisting of the 12 panel chairs and an independent chair.

In the event, 22 of the 45 PBRF-eligible TEOs participated in the 2003 Quality Evaluation, including all of New Zealand's eight universities. Of the 8,013 PBRF-eligible staff in these 22 TEOs, 5,771 had their EPs assessed by a peer review panel. The remainder were automatically assigned an "R". Although 23 eligible TEOs did not participate, it is very likely--given the level of research output in the non-participating TEOs and given the results of academic researchers in those that did--that the 2003 Quality Evaluation included the vast majority of research-active staff within New Zealand's tertiary education sector.

OVERALL RESULTS OF THE 2003 QUALITY EVALUATION

The published results of the 2003 Quality Evaluation provided a research "quality profile" for each participating TEO, subject area and nominated academic unit (see TEC 2004a). The profile furnished data on the number of PBRF-eligible staff--on both a headcount and a full-time equivalent (FTE) basis--in the relevant "grouping", the number and proportion of As, Bs, Cs and Rs, and a quality score (out of a maximum possible of 10). In order to calculate the quality score, weightings were assigned to the four Quality Categories based on the PBRF funding formula: "A" (5), "B" (3), "C" (1) and "R" (0). The weighted scores were then multiplied by 2 and divided by the total number of eligible staff in the relevant grouping. To secure the maximum score of 10, all the members of the relevant grouping would need to have been assigned an "A" Quality Category.
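The quality-score calculation described above can be sketched as follows. The category weights (A = 5, B = 3, C = 1, R = 0) and the multiply-by-2 step come from the text; the staff counts in the worked example are invented for illustration and do not correspond to any actual TEO or subject area.

```python
# Sketch of the 2003 Quality Evaluation quality-score calculation.
# Weighted sum of Quality Categories, doubled, divided by the number of
# PBRF-eligible staff in the grouping; maximum possible score is 10.

WEIGHTS = {"A": 5, "B": 3, "C": 1, "R": 0}

def quality_score(counts):
    """Return the quality score (0-10) for a grouping.

    counts maps each Quality Category ("A", "B", "C", "R") to the number
    of eligible staff (headcount or FTE) assigned that category.
    """
    total_staff = sum(counts.values())
    weighted_sum = sum(WEIGHTS[cat] * n for cat, n in counts.items())
    return 2 * weighted_sum / total_staff

# A grouping in which every member holds an "A" scores the maximum of 10:
print(quality_score({"A": 10, "B": 0, "C": 0, "R": 0}))  # 10.0

# Hypothetical grouping of 20 staff: 2 As, 8 Bs, 6 Cs, 4 Rs.
# Score = 2 * (2*5 + 8*3 + 6*1 + 4*0) / 20 = 2 * 40 / 20 = 4.0
print(quality_score({"A": 2, "B": 8, "C": 6, "R": 4}))  # 4.0
```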

The overall results of the 2003 Quality Evaluation are outlined in Table 1. Significantly, the peer review panels assigned relatively few high Quality Categories, with only 5.7% of PBRF-eligible staff receiving an "A" and 23.2% a "B". Altogether, almost 40% were assigned an "R". As might be expected, the results of the internal assessment conducted by TEOs were somewhat more favourable, with 11.9% of staff nominated an "A" and only 26.9% an "R".

The overall quality score for the tertiary sector as a whole (i.e. the 22 participating TEOs) was 2.59 (FTE-weighted). This was lower than many expected, and lower than the results of the internal TEO assessment undertaken (which yielded a quality score of 3.49). The low average quality score and the relatively low proportion of staff assigned an "A" or "B" reflected a variety of factors, including:

* the exacting nature of the criteria for achieving a high...
