Analytic Quality Glossary


A B C D E F G H I J K L M N O P Q R S T U V W X Y Z Home


Citation reference: Harvey, L., 2004-17, Analytic Quality Glossary, Quality Research International,

This is a dynamic glossary and the author would welcome any e-mail suggestions for additions or amendments. Page updated 2 January 2017, © Lee Harvey 2004–2017.


Benchmarking



core definition

Benchmarking is a process that enables comparison of inputs, processes or outputs between institutions (or parts of institutions) or within a single institution over time.

explanatory context

Benchmarking, in practice, tends to be more about sharing good practice than undertaking formal comparative measurements. However, the term has a wide range of meanings especially in the higher education context. According to Schofield (1998, pp. 11–12), for example, the term is used in diverse ways:

Massaro identifies one aspect of the problem in that “the term is used fairly loosely to cover qualitative comparisons, statistical comparisons with some qualitative assessment of what the statistics mean, and the simple generation of statistical data from a variety of sources which are then published as tables with no attempt at interpretation”... [and] Wragg, in his description in Chapter 7 of the Commonwealth ‘Benchmarking Club’, sees one of the advantages of the co-operative methodology that was adopted in that approach as leading to “a true benchmarking process, ie in the absence of predetermined benchmarks, the aim is to establish benchmarks through the process... which can themselves be used in future to guide management in the quest for continuous improvement”.

analytical review

Campbell and Rozsnyai's (2002, p. 131) definition is:

Benchmarking: Setting levels against which quality is measured or a process of identifying and learning from good practice in other organizations.


The Quality Assurance Agency for Higher Education (QAA) (undated) has a similarly terse definition, which is closer to the concept of a benchmark than to the benchmarking process:

Benchmarking: A term used to describe a standard against which comparisons can be made.

Higher Education Funding Council for England (HEFCE) (undated) defines benchmarking as:

A process through which practices are analysed to provide a standard measurement ('benchmark') of effective performance within an organisation (such as a university). Benchmarks are also used to compare performance with other organisations and other sectors.

European Commission, Education and Training (2008) defines benchmarking in a limited way:

A standardised method for collecting and reporting critical operational data in a way that enables relevant comparison of the performances of different organisations or programmes, often with a view to establishing good practice.


The UNESCO definition of benchmarking is:

A standardized method for collecting and reporting critical operational data in a way that enables relevant comparisons among the performances of different organizations or programmes, usually with a view to establishing good practice, diagnosing problems in performance, and identifying areas of strength. Benchmarking gives the organization (or the programme) the external references and the best practices on which to base its evaluation and to design its working processes.

Benchmarking is also defined as:

– a diagnostic instrument (an aid to judgments on quality);

– a self-improvement tool (a quality management/quality assurance tool) allowing organizations (programmes) to compare themselves with others regarding some aspects of performance, with a view to finding ways to improve current performance;

– an open and collaborative evaluation of services and processes with the aim of learning from good practices;

– a method of teaching an institution how to improve;

– an on-going, systematically oriented process of continuously comparing and measuring the work processes of one organization with those of others by bringing an external focus on internal activities. (Vlãsceanu et al., 2004, p. 25)


For AEC (2004) benchmarking is more restricted and quite specific:

A process by which standards are set in terms of levels of challenge and typical content for a given award (e.g. a Bachelor degree in music).


For HEQC (2004, p. 26):

Benchmarking: A process by which an institution, programme, faculty, school, or any other relevant unit evaluates and compares itself in chosen areas against internal and external, national and international reference points, for the purposes of monitoring and improvement.


Karjalainen (2003) has an interesting take on benchmarking:

In the literature benchmarking has many definitions. I have divided these definitions into three categories: practical definitions, existential definitions and metaphorical definitions.

Practical definitions tell, through prose, what benchmarking is or what activities it includes:

Benchmarking: is the systematic study and comparison of a company’s key performance indicators with those of competitors and others considered best-in-class in a specific function. (Dervitsiotis, 2000);


is a way of comparing a product or process against others, with reference to specified standards. (Pepper, Webster & Jenkins 2001) [not referenced in article];

Existential definitions try to connect benchmarking with the experiences, emotions and basic processes of the human existence. These definitions bring the method closer to our ordinary living world. They suggest that benchmarking is only a more formalized dimension of our natural everyday interaction:

it is, at bottom, a systematic way of learning from others and changing what you do. (Epper 1999)

It is actually a matter of imitating successful behaviour. (Karlöf & Östblom 1993)

Benchmarking is a form of human being’s natural curiosity with which s/he explores the possibilities of cooperation and friendship. (Karjalainen, Kuortti & Niinikoski 2002)

Benchmarking is a learning process, which requires trust, understanding, selecting and adapting good practices in order to improve. (One team in ENQA workshop 2002) [not referenced in article].

So far there are no really strong metaphorical benchmarking definitions. This indicates that researchers, consultants, managers and other benchmarking users merely see the method as a technical question. Metaphorical definitions, by using metaphorical expressions, could provide new and astonishing perspectives. They could provide a surprising and a revelatory angle to the nature of benchmarking or give a sudden insight to the inner meanings of the method. State of mind of an organization is an example of a weak metaphorical expression:

it is the state of mind of an organization which encourages the continuous effort of comparing functions and processes with those of best in class, wherever they are to be found. (Zairi & Leonard 1994)

But why would we not develop stronger ones? What if we called benchmarking ‘the shortcut through the forest of the quality assessment’, ‘the flower of the organisational curiosity’ or ‘the envious energy between the managers’? Each of these metaphors implies a very different benchmarking concept and process. (Karjalainen, 2003, pp. 8–9).


The Scottish Higher Education Enhancement Committee (2009) opted for the following definition of benchmarking:

What do we mean by 'benchmarking'? Jackson and Lund (2000) suggested a working definition for benchmarking in UK higher education which encompasses both development and accountability: '...a process to facilitate the systematic comparison and evaluation of practice, process and performance to aid improvement and regulation.' They add that benchmarking is: '...first and foremost, a learning process structured so as to enable those engaging in the process to compare their services-activities-processes-products-results in order to identify their comparative strengths and weaknesses as a basis for self-improvement and/or self-regulation. Benchmarking offers a way of not only doing the same things better but of discovering 'new, better and smarter' ways of doing things and in the process of discovery, understanding why they are better or smarter.' The Working Group embraced this definition, particularly the emphasis upon learning from the process. In the context of its work, the Group defined benchmarking as identifying, considering, comparing and learning from developing practice in Scotland and internationally, and set about actioning this.

associated issues

Vlãsceanu et al. (2004, pp. 26–28) also point out that benchmarking ‘implies specific steps and structured procedures’ and that there are different types of benchmarking depending on what data are compared. They specify the following:

strategic benchmarking (focusing on what is done, on the strategies organizations use to compete);

operational benchmarking (focusing on how things are done, on how well other organizations perform, and on how they achieve performance);

data-based benchmarking (statistical benchmarking that examines the comparison of data-based scores and conventional performance indicators)…

Internal Benchmarking: Benchmarking (comparisons of) performances of similar programmes in different components of a higher education institution. Internal benchmarking is usually conducted at large decentralized institutions in which there are several departments (or units) that conduct similar programmes.

(External) Competitive Benchmarking: Benchmarking (comparisons of) performance in key areas, on specific measurable terms, based upon information from institution(s) that are viewed as competitors.

Functional (External Collaborative) Benchmarking: Benchmarking that involves comparisons of processes, practices, and performances with similar institutions of a larger group of institutions in the same field that are not immediate competitors.

Trans-Institutional Benchmarking: Benchmarking that looks across multiple institutions in search of new and innovative practices, no matter what their sources.

Generic Benchmarking: Compares institutions in terms of a basic practice, process or service (e.g., communication lines, participation rate, and drop-out rate). It compares the basic level of an activity with processes in other institutions that have similar activities.

Process-Based Benchmarking: Goes beyond the comparison of data-based scores and conventional performance indicators (statistical benchmarking) and looks at the processes by which results are achieved. It examines activities made up of tasks and steps which cross the boundaries between the conventional functions found in all institutions.

Implicit Benchmarking: A quasi-benchmarking that looks at the production and publication of data and of performance indicators that could be useful for meaningful cross-institutional comparative analysis. It is not based on the voluntary and proactive participation of institutions (as in the cases of other types), but as the result of the pressure of markets, central funding, and/or co-ordinating agencies. Many of the current benchmarking activities taking place in Europe are of this nature.

Within different types, benchmarking may be either vertical (aiming at quantifying the costs, workloads, and learning productivity of a predefined programme area) or horizontal (looking at the costs of outcomes of a single process that cuts across more than one programme area).


Vlãsceanu et al. (2004, pp. 26–27) also identify early examples of benchmarking:

– The National Association of Colleges and University Business Officers (NACUBO) Benchmarking Project, started in 1991–92, which has taken a statistical and financial approach to benchmarking;

– The History 2000 Project, led by Paul Hyland, School of Historical and Cultural Studies, Bath College of Higher Education, an example of benchmarking of academic practice;

– The Royal Military College of Science (RMCS) Programme at Cranfield University, an example of benchmarking in libraries;

– The Higher Education Funding Council for England (HEFCE) Value for Money Studies (VfM), launched in 1993;

– “The Commonwealth University International Benchmarking Club”, launched in 1996 by the Commonwealth Higher Education Management Service (CHEMS), an example of international benchmarking;

– The Copenhagen Business School (CBS) benchmarking analysis of twelve higher education institutions, 1995;

– The German Benchmarking Club of Technical Universities (BMC), 1996;

– The CRE “Institutional Quality Management Review”, based on peer reviews and mutual visits to universities participating voluntarily in a cycle, each time focusing on a specific issue, an example of implicit benchmarking (CHEMS, 1998).


Karjalainen (2003, p. 8) also contributes to the debate, distinguishing 'true' from 'false' benchmarking:

In true benchmarking organisations and people learn from each other and there is dialogue. It has explicit and open goals and the decision-making process is (as) clear (as possible). True benchmarking is always creative. Adapting best practices does not mean the same as copying them. False benchmarking is rank-oriented or merely explorative without interest in improvement. It has hidden purposes and it may even be spying. Nor is touristy visiting true benchmarking. Fuzzy goals and undefined processes are typical false benchmarking constituents. Performance measurement by using some benchmarks moves into true benchmarking when it defines targets for improvement by identifying best practices and adapting them to achieve continuous improvement in one’s own organization.

related areas

See also


benchmark statement


Association europeenne des conservatoires [Academies de musique et musikhochschulen] (AEC), 2004, Glossary of terms used in relation to the Bologna Declaration, accessed September 2004. Not available at this address 31 January 2011.

Campbell, C. & Rozsnyai, C., 2002, Quality Assurance and the Development of Course Programmes, Papers on Higher Education, Regional University Network on Governance and Management of Higher Education in South East Europe, Bucharest, UNESCO.

Commonwealth Higher Education Management Service (CHEMS), 1998, Benchmarking in Higher Education: An International Review, Twente, CHEMS.

Council on Higher Education, Higher Education Quality Committee (HEQC), 2004, Criteria for Institutional Audits, April (Pretoria, Council on Higher Education).

Dervitsiotis, K. N., 2000, 'Benchmarking and business paradigm shifts', Total Quality Management, 11, pp. 641–46.

Epper, R., 1999, 'Applying benchmarking to higher education', Change, pp. 24–31.

European Commission, Education and Training, The Lifelong Learning Programme 2007–2013, 2008, Glossary, available at last update: 11 April 2008, accessed 1 March 2011, page not available 31 December 2016.

Higher Education Funding Council for England (HEFCE), undated, Glossary, available at, accessed 31 December 2016.

Karjalainen, A., 2003, ‘Benchmarking in brainstorming' in Hämäläinen, K., Dørge Jessen, A., Kaartinen-Koutaniemi, M., Kristoffersen, D., Benchmarking in the Improvement of Higher Education, ENQA Workshop Reports 2, Helsinki, European Network for Quality Assurance in Higher Education, 2002, printed 2003, available at, accessed 7 February 2011, Page not available 31 December 2016.

Karjalainen, A., Kuortti, K. & Niinikoski, S., 2002, Creative Benchmarking. University of Oulu & Finnish Higher Education Evaluation Council. University Press of Oulu.

Karlöf, B. & Östblom, S., 1993, Benchmarking: A signpost to excellence in quality and productivity, Chichester, Wiley.

Quality Assurance Agency for Higher Education (QAA), undated, Acronyms and glossary of main terms, available at, accessed 29 January 2011, not at this address 6 February 2012.

Schofield, A., 1998, 'Benchmarking: an overview of approaches and issues in implementation', in UNESCO, 1998, New Papers on Higher Education 21, Studies and Research: Benchmarking in Higher Education, a study conducted by the Commonwealth Higher Education Management Service, ED98/WS, Paris, available at, accessed 7 February 2011, still available 31 December 2016.

Scottish Higher Education Enhancement Committee, 2009, International Benchmarking Working Group, Supporting Student Success: A forward-looking agenda, Final report, April 2009, Quality Assurance Agency for Higher Education, available at, accessed 8 February 2011, not available 2 February 2012.

Vlãsceanu, L., Grünberg, L., and Pârlea, D., 2004, Quality Assurance and Accreditation: A Glossary of Basic Terms and Definitions (Bucharest, UNESCO-CEPES) Papers on Higher Education, ISBN 92-9069-178-6, available at, accessed 20 September 2012, still available 31 December 2016.

Vlãsceanu, L., Grünberg, L., and Pârlea, D., 2007, Quality Assurance and Accreditation: A Glossary of Basic Terms and Definitions (Bucharest, UNESCO-CEPES), revised and updated edition, ISBN 92-9069-186-7, available at, accessed 31 December 2016.

Zairi, M. & Leonard, P., 1994, Practical Benchmarking: The Complete Guide, Chapman & Hall, United Kingdom.

copyright Lee Harvey 2004–2017


