Analytic Quality Glossary

 


 

Citation reference: Harvey, L., 2004-24, Analytic Quality Glossary, Quality Research International, http://www.qualityresearchinternational.com/glossary/

This is a dynamic glossary and the author would welcome any e-mail suggestions for additions or amendments. Page updated 8 January 2024, © Lee Harvey 2004–2024.

 


Research Assessment Exercise


core definition

The RAE is a process, in the UK and Hong Kong, that assesses the quality of research to enable the higher education funding bodies to distribute public funds on the basis of research quality ratings.


explanatory context

In the UK, institutions that conduct the best research receive a higher proportion of funding, so that high-quality research is rewarded, protected and developed. The RAE was carried out every few years by the four UK funding bodies. The RAE that took place in 2001, for example, was used to distribute £5 billion of research funds (HEFCE, 2003).

 

The RAE provided quality ratings for research across all disciplines using a standard scale ranging from 1 to 5 (although, with sub-categories, there were seven grades: 1, 2, 3b, 3a, 4, 5 and 5*). Grades were determined by how much of the work was judged by a peer panel to reach national or international levels of excellence.
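To illustrate the mechanics of selective allocation, the sketch below (in Python) shows how a quality grade combined with research volume might drive a funding share. The grade weights, department names and staff numbers are hypothetical illustrations, not the actual funding-council formula values, which varied by year and country.

    # A minimal sketch of quality-weighted selective funding allocation.
    # All weights and volumes are hypothetical, NOT actual HEFCE values.
    GRADE_WEIGHTS = {"1": 0.0, "2": 0.0, "3b": 0.0, "3a": 0.3,
                     "4": 1.0, "5": 1.8, "5*": 2.7}

    def allocate(submissions, total_fund):
        # Share = grade weight x volume of research-active staff.
        shares = {dept: GRADE_WEIGHTS[grade] * staff
                  for dept, (grade, staff) in submissions.items()}
        total = sum(shares.values())
        return {dept: total_fund * s / total if total else 0.0
                for dept, s in shares.items()}

    # Hypothetical submissions: department -> (grade, staff submitted).
    submissions = {"History": ("5*", 20), "Physics": ("4", 35),
                   "Nursing": ("3a", 15)}
    for dept, grant in allocate(submissions, 1_000_000).items():
        print(f"{dept}: £{grant:,.0f}")

Under such a scheme, lower-graded submissions attract little or no funding, which is what drives the concentration of research noted below.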

 

The outcomes of the RAE were published and so provided public information about the quality of research in universities and colleges throughout the UK, useful to industry, commerce and the voluntary sector in guiding their research funding decisions. The RAE also gave an indication of the relative quality and standing of UK academic research and provided benchmarks that institutions used in developing and managing their research strategies. However, the RAE tended to lead to a concentration of research and, it is argued, a proliferation of research papers (HERO, 2000).


The RAE has been replaced, in the UK, by a similar process called the Research Excellence Framework (REF). The REF is undertaken by the four UK higher education funding bodies; the first exercise was completed in 2014 (a second followed in 2021). The primary purpose claimed for the REF is to produce assessment outcomes for each submission made by institutions. According to HEFCE (no date): ‘The funding bodies intend to use the assessment outcomes to inform the selective allocation of their research funding to HEIs, with effect from 2015–16. The assessment provides accountability for public investment in research and produces evidence of the benefits of this investment. The assessment outcomes provide benchmarking information and establish reputational yardsticks.’

 

In Hong Kong, the first Research Assessment Exercise (RAE), undertaken in January 1994, aimed to assess the research output performance of the UGC-funded institutions by cost centre; the results were used as the basis for allocating some of the research portion of the institutional recurrent grant for the triennium 1995–98. For this exercise, research was broadly defined to include, in addition to traditional academic research outputs, contract research, art objects, performances, designs and other creative works (UGC, 2004).

 

New Zealand, Denmark and Australia have similar forms of research assessment.


analytical review

The Higher Education Funding Council for England (HEFCE) (2003) defined the RAE as:

Research Assessment Exercise (RAE): The process of assessing the quality of research for funding purposes. The RAE is carried out every few years by the four UK funding bodies. The most recent RAE took place in 2001.

 

The RAE2008 site stated:

RAE 2008 is the sixth in a series of exercises conducted nationally to assess the quality of UK research and to inform the selective distribution of public funds for research by the four UK higher education funding bodies.

RAE 2008 will provide quality profiles for research across all disciplines. Submissions from institutions will be assessed by experts in some 70 units of assessment. The main body of the assessment will take place in 2007–08, with outcomes to be published by the funding bodies in December 2008.

 

The University of Leicester (2001) School of Archaeology & Ancient History website stated:

The RAE is a periodic examination of the research achievements and quality of university departments across the UK. The following is quoted from the central RAE 2001 website:

‘The main purpose of the Research Assessment Exercise (RAE) is to enable the higher education funding bodies to distribute public funds for research selectively on the basis of quality. Institutions conducting the best research receive a larger proportion of the available grant so that the infrastructure for the top level of research in the UK is protected and developed. The RAE assesses the quality of research in universities and colleges in the UK. It takes place every four to five years and the next exercise will be held in 2001. Around £5 billion of research funds will be distributed in response to the results of the 2001 RAE.

‘The RAE provides quality ratings for research across all disciplines. Panels use a standard scale to award a rating for each submission. Ratings range from 1 to 5* [the highest], according to how much of the work is judged to reach national or international levels of excellence. Higher education institutions (HEIs) which take part receive grants from one of the four higher education funding bodies in England, Scotland, Wales and Northern Ireland.’

 

The Australian Research Council (2015) stated:

Excellence in Research for Australia is an assessment system that evaluates the quality of the research conducted at Australian universities. The objectives of ERA are to:

1. establish an evaluation framework that gives government, industry, business and the wider community assurance of the excellence of research conducted in Australian higher education institutions
2. provide a national stocktake of discipline level areas of research strength and areas where there is opportunity for development in Australian higher education institutions
3. identify excellence across the full spectrum of research performance
4. identify emerging research areas and opportunities for further development
5. allow for comparisons of research in Australia, nationally and internationally, for all discipline areas.

ERA measures performance within each discipline at each university and gives us a detailed view of the research landscape in Australia, from quantum physics to literature. It highlights national research strengths in areas of critical economic and social importance—such as Geology, Environmental Science and Management, Nursing, Clinical Sciences, Materials Engineering, Psychology, Law and Historical Studies and many others. In addition, ERA results highlight the research strengths of individual universities. The ERA data presented in each National Report also provides contextual information about research application, knowledge exchange and collaboration.

The next round of ERA will take place in 2018.


associated issues

In a report of their research dated 26 June 2014, titled 'Performance-based research assessment is narrowing and impoverishing the university in New Zealand, UK and Denmark', Susan Wright, Bruce Curtis, Lisa Lucas and Susan Robertson comment on performance-based research assessments (PBRAs) in the UK, New Zealand and Denmark. They note that PBRAs act as a quality check, a method of allocating funding competitively between and within universities, and a means for governments to steer universities to meet what politicians consider to be the needs of the economy. Four main points emerge from their work.

Narrowing of the Purpose of the University

PBRAs gained renewed purpose when governments accepted the arguments of the OECD and other international organisations that, in a fast-approaching and inevitable future, countries had to compete over the production and utilisation of knowledge and in the market for students (Wright 2012). Governments saw universities as the source of these new raw materials, and PBRAs became important mechanisms to steer universities in particular directions. However, they are quite a blunt instrument: PBRAs’ assessment methods prioritise ‘academic’ publications, which have notoriously few readers but which are heavily weighted in global rankings of universities. This focus is therefore appropriate where governments aim for their universities to claim ‘world class’ status in order to attract global trade in students. However, such an instrument steers academic effort away from other purposes of the university, which might also be part of government’s aims, for example transferring ideas to industry or contributing more widely to social debates and democracy. In all cases, PBRAs capture only certain aspects of the university, with the danger of narrowing and impoverishing the mission of the university.

Glorification of Leaders

Just as measures become targets, so such steering tools become the main rationale of management and are used by them to reshape the university. One of the points raised in discussion at the URGE symposium was how governments’ steering of universities through such measures relies on enhancing the powers of leaders. Lucas (2006; 2009) has shown how the history of the UK’s Research Assessment Exercise (RAE) is paralleled by the emergence of a managerial class to control the university’s performance. Robertson’s case study records how yet another new administrative apparatus was developed to advise and quality control academics in the devising and writing of ‘impact’ case studies for the Research Excellence Framework (REF, which replaced the previous RAE). These systems of steering universities have contributed not only to what in the U.S. is called universities’ ‘administrative bloat’ (Ginsberg 2011) but also to what was referred to in the URGE symposium as the ‘glorification’ of vice chancellors. When university managers’ Key Performance Indicators in New Zealand and the UK are based on improving their university’s status in national and global rankings, those rankings become organisational imperatives. A new language has emerged that speaks of the violence involved in the RAE, for example, ‘cutting off the tail’ of departments: getting rid of academics who, regardless of any other qualities and contributions, score low in RAE-able publications. In New Zealand, the PBRA rationale has not taken over the life of the university so compulsively and other narratives about the purpose of the university are still available.

Myths of the Level Playing Field

PBRAs are accompanied by rhetoric that their standardised metrics obviate favouritism and install meritocratic advancement. It was argued at the URGE symposium that, formerly, there were baronial departments in which only the head of department’s (usually male) cronies succeeded. Now, the argument goes, there are clear criteria for promotion and funding, and all can strategise, individually, to succeed. Such transparent criteria should lead to both excellence and equity. Yet the new metric for promotion fetishises external funding, and Curtis’ analysis reveals that the PBRF systematically disadvantages women, those trained in New Zealand, and those studying New Zealand issues. In the UK, the RAE also systematically disadvantages women.

Robertson’s analysis of the shift from RAE to REF in the UK clearly shows the systematic advantages and disadvantages of the different systems. Subjects like nursing, public policy and some humanities, which had done badly under the RAE’s focus on academic publications, were now good at demonstrating ‘impact’ in the REF. For these subjects, the income from REF ‘impact’ would make a considerable difference whereas, for some other subjects, such as engineering, the cost in academic time to put together REF cases demonstrating their undoubted ‘impact’ would not yield sufficient returns, compared to their other sources of income.

Dangerous Coherence

PBRAs act as tools of governance when their definition of ‘what counts’ pervades government steering, university management and academic identity formation (Wright forthcoming). Unambiguous definitions of what counts provide clear messages to university staff and managers, who then act accordingly, perhaps not in line with other government indicators. The recent inclusion of ‘impact’ in the REF reflected governmental concern that the previous RAE’s primary focus on each academic producing four articles in top journals had eroded the capacity for staff to provide policy advice. The inclusion of impact broadens and complicates ‘what counts’. In this respect Curtis (2007; under review) has noted how the PBRF in New Zealand provides mixed messages to university managers. New Zealand universities also have a legal obligation to be the ‘critic and conscience of society’. Similarly, Danish universities have a legal obligation to engage with and disseminate their research to ‘surrounding society’. Both obligations would have the potential to diversify ‘what counts’ if performance and funding measures were devised in keeping with them. Hopefully, the UK’s quest for ‘impact’ will have a wider impact: unmasking the operations of PBRAs as political technologies and exposing their role in a pervasive form of governance that is narrowing and impoverishing the public purpose of the university.

This piece is based on the findings in the 2014 working paper Research Assessment Systems and their Impacts on Academic Work in New Zealand, the UK and Denmark, Working Papers in University Reform no. 24. Copenhagen: DPU, Aarhus University, April.


related areas

See also

accountability

assessment

quality


Sources

Australian Research Council, 2015, 'ERA FAQs', available at http://www.arc.gov.au/era-faqs, accessed 10 January 2017, not available 15 May 2022.

Higher Education and Research Opportunities in the United Kingdom (HERO), 2000, http://www.hero.ac.uk/sites/hero/rae/Pubs/5_99/ByUoA/crit60.htm, last updated 17 April 2000, not available at this address 5 March 2011.

Higher Education Funding Council for England (HEFCE), 2003, About us: Glossary, http://www.hefce.ac.uk/glossary/glossary.htm, updated 3 January 2003, no longer available.

Higher Education Funding Council for England (HEFCE), no date, Research Excellence Framework, http://www.hefce.ac.uk/research/ref/, accessed 24 January 2012, not available 15 May 2022.

RAE2008, 2004, Research Assessment Exercise: What is the RAE 2008? http://www.rae.ac.uk/default.htm, no longer available at this address 24 January 2012, see review at http://www.rae.ac.uk/, accessed 20 September 2012, not available 15 May 2022.

University Grants Committee [Hong Kong], 2004, Quality Assurance of Research, http://www.ugc.edu.hk/chinese/documents/papers/kentlq.html, accessed October 2004, not available 24 January 2012.

University of Leicester, 2001, School of Archaeology & Ancient History, What is the Research Assessment Exercise? http://www.le.ac.uk/ar/rae.htm, last updated: 11 December 2001, not available 24 January 2012.

Wright, S., Curtis, B., Lucas, L. and Robertson, S., 2014, 'Performance-based research assessment is narrowing and impoverishing the university in New Zealand, UK and Denmark', available at http://blogs.lse.ac.uk/impactofsocialsciences/2014/06/26/research-assessment-impact-new-zealand-uk-denmark/, accessed 9 January 2017, still available 15 May 2022.

