Social Research Glossary
Citation reference: Harvey, L., 2012-17, Social Research Glossary, Quality Research International, http://www.qualityresearchinternational.com/socialresearch/
This is a dynamic glossary and the author would welcome any e-mail suggestions for additions or amendments. Page updated 2 January, 2017, © Lee Harvey 2012–2017.
Citation analysis is a technique used in the sociology and philosophy of science as a means of identifying communicating networks of scientists.
The procedure involves logging the citations that occur in scientific papers in order to build up a picture of a network through the frequency of cross-references to network members' work.
Citation analysis has increasingly (since the 1980s) been used as a measure of the esteem of researchers (in all disciplines). The process is linked to evaluations of research (usually for funding purposes), and the number of citations of a work is presumed to be a proxy for the value or worth of the cited work.
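The logging-and-counting procedure described above can be illustrated with a minimal sketch in Python. The paper identifiers and citation lists here are invented for illustration only; in practice the records would come from a citation database. Counting how often each work is cited gives the proxy measure of worth, and linking papers that cite one another gives a crude picture of a communication network.

```python
from collections import Counter, defaultdict

# Hypothetical citation records: each paper mapped to the works it cites.
citations = {
    "paper_A": ["paper_B", "paper_C"],
    "paper_B": ["paper_C"],
    "paper_D": ["paper_A", "paper_C", "paper_B"],
}

# Frequency of citation per work: the proxy for value or worth.
cited_counts = Counter(ref for refs in citations.values() for ref in refs)

# A simple undirected network: a paper and the works it cites are linked,
# so clusters of mutual cross-reference become visible.
network = defaultdict(set)
for paper, refs in citations.items():
    for ref in refs:
        network[paper].add(ref)
        network[ref].add(paper)

print(cited_counts.most_common())  # most-cited works first
print(dict(network))
```

Real citation studies work at a much larger scale and use dedicated graph tools, but the core operations are exactly these two: tallying citation frequency and assembling the cross-reference network.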
Garfield (1979, p. 240) wrote the following, but concluded that citation analysis was a useful tool for evaluating scientific work:
The use of citation analysis to produce measures, or indicators, of scientific performance has generated a considerable amount of discussion (1–10). Not surprisingly, the discussion grows particularly intense when the subject is the use of these measures to evaluate people, either as individuals or in small formal groups, such as departmental faculties in academic institutions. Published descriptions of how citation analysis is being used to define the history of scientific development, or to measure the activity and interaction of scientific specialties, generate relatively little comment from the scientific community at large. And what is generated tends to be calm and reasoned. In contrast, any mention of using citation analysis to measure the performance of specific individuals or groups produces an automatic, and often heatedly emotional, response from the same people who otherwise remain silent.
MacRoberts and MacRoberts (2010, pp. 5–7), in a paper that investigated the influences on scientific work, argued that citation analysis has little value:
Our approach, unlike traditional citation analysis, does not begin and end with lists of citations but goes to the text and beyond to determine what the influences on scientific work actually are. This approach produces a very different dataset (and understanding) than that produced by examining only citations. Thus, we find that the articles (i.e., works) used for biogeographical databases are not only read but are redacted and their data repeatedly used. But they are not cited. Let us make this point clear: We are not talking here about articles in the journals not monitored by the Thomson Reuters but instead about the articles in the journals monitored by Thomson Reuters: the so-called “top 10% of journals,” journals such as those from which our examples largely come: Science, Nature, American Naturalist, Ecology, Journal of Biogeography, Annals of Botany, Systematic Zoology, Ecology Letters, Applied Vegetation Science, Castanea, Great Basin Naturalist, and so on. We are talking about the thousands of floras and faunas, atlases, millions of herbarium and zoological specimens, thousands of unnamed fieldworkers, and unnamed persons consulted, the information synthesized into massive databases, the data collected by the Natural Heritage Programs, the data amassed by the USDA Plants database, and so on, that are used but not cited. We are talking about the literature not monitored by Thomson Reuters: the nonprestigious literature, the grey literature, the “notes” published by botanists and zoologists that describe range extensions, master’s theses, and birdwatchers’ distribution reports. This is information used (but not cited) in articles published in the journals scanned by Thomson Reuters. These workers and works are invisible to citation analysts who rely on standard citation databases…
It seems to some that we are setting up a straw man, but this is not the case. Garfield (1997) believed that the question is “Do scientific articles cite most of the relevant articles that led up to the current work?” Our answer, of course, is a resounding no (Greenberg, 2009; M.H. MacRoberts, 1997). In the present case, without the data articles, sine qua non. But if all one wants to know is who is cited and how often in the journals monitored by Thomson Reuters, then turn to Thomson Reuters. But if one wants to know who contributes to science and how information is used and moves through the system, then another course is necessary. This will not involve “grand narratives and large-scale number crunching” (Cronin, 2005, p. 1505), but instead research on what goes on at the lab bench, what scientists do as they work and interact with colleagues, what they read, how they develop their data, and how they construct their articles within the culture of their disciplines (for references, see Hicks & Potter, 1991; M.H. MacRoberts & MacRoberts, 1996; also see Greenberg, 2009). As Nobel laureate Peter Medawar (1969) stated, “it is no use looking to scientific ‘papers,’ for they not merely conceal but actively misrepresent the reasoning that goes into the work they describe. Only unstudied evidence will do—and that means listening at the keyhole” (p. 32).
Cronin, B., 2005, 'A hundred million acts of whimsy?' Current Science, 89, pp. 1505–1509.
Garfield, E., 1979, Citation Indexing: Its Theory and Application in Science, Technology, and Humanities, New York, Wiley, available at http://www.garfield.library.upenn.edu/ci/title.pdf, accessed 1 February 2013, still available 14 December 2016.
Garfield, E., 1997, 'Validation of citation analysis', Journal of the American Society for Information Science, 48, p. 962.
Greenberg, S.A., 2009, 'How citation distortions create unfounded authority: Analysis of a citation network', British Medical Journal, 339, available at http://www.bmj.com/cgi/content/full/339/jul20_3/b2680, accessed 27 September 2009, page not available 14 December 2016.
Hicks, D. and Potter, J., 1991, 'Sociology of scientific knowledge: a reflexive citation analysis of science disciplines and disciplining science', Social Studies of Science, 21, pp. 459–501.
MacRoberts, M.H., 1997, 'Rejoinder', Journal of the American Society for Information Science, 48, p. 963.
MacRoberts, M.H., & MacRoberts, B.R., 1996, 'Problems of citation analysis', Scientometrics, 36, pp. 435–44.
MacRoberts, M.H. and MacRoberts, B.R., 2010, 'Problems of citation analysis: a study of uncited and seldom-cited influences', Journal of the American Society for Information Science and Technology, 61(1), pp. 1–13.
Medawar, P., 1969, The Art of the Soluble, Harmondsworth, Penguin.
copyright Lee Harvey 2012–2017