Derek John de Solla Price died on September 3, 1983. The loss of this exciting and dynamic man is one which is felt not just by his friends, but by the scientific community as a whole. This article was originally planned as part of an essay for Current Contents® (CC®).1 But I was delighted by the opportunity to contribute it to this special tribute issue of Scientometrics.
Theories of citation are as elusive as theories of information science, which have been debated for decades. But as a basis
for discussion I offer the term citationology as the theory and practice of citation, including its derivative disciplines
citation analysis and bibliometrics. Several maxims, commandments if you will, have been enunciated. References are the products
of a specialized symbolic language with a citation syntax and grammar. References, like words, have multiple meanings, which
are related to the a posteriori quality of citation indexes. Therefore, citation relevance cannot be predicted. Mathematical
microtheories in bibliometrics abound, including the apposite laws of scattering and concentration. Citation behavior is a
vast subset of citation theory, which, like citation typology, can never be complete. Deviant citation behavior preoccupies
certain authors but it is rarely significant in well-designed citation analyses, where proper cohorts are defined. Myths about
uncitedness and the determinants of impact are discussed, as well as journal impact factors as surrogates and observations
on scientists of Nobel Class.
After two years at Johns Hopkins investigating “machine documentation,” and another year as a student of library science,
I became, fortuitously, a documentation consultant. By 1954, I called myself an information engineer, which was an apt description
of my professional consulting activities. However, Pennsylvania licensing law requires that engineers be graduates of engineering
schools. So I became an information scientist! I've never thought of myself as an information theoretician and have been skeptical
about a need for a theory of information science. I've practiced information science and engineering without explicit theoretical
support. But undoubtedly there are underlying principles which can guide information scientists who, like myself, could be
called “citationists” or “citationologists”. If there is a theory and practice of citation, it should probably be called citationology.
This is a comprehensive discussion of the use of citation analysis to rate scientific performance and of the controversy surrounding it. The general adverse criticism that citation counts include an excessive number of negative citations (citations to incorrect results worthy of attack), self-citations (citations to the works of the citing authors), and citations to methodological papers is analyzed. Also included is a discussion of measurement problems such as counting citations for multiauthored papers, distinguishing between more than one person with the same last name (homographs), and what it is that citation analysis actually measures. It is concluded that as the scientific enterprise becomes larger and more complex, and its role in society more critical, it will become more difficult, more expensive, and more necessary to evaluate and identify the largest contributors. When properly used, citation analysis can introduce a useful measure of objectivity into the evaluation process at relatively low financial cost.
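The measurement problem raised above for multiauthored papers can be sketched in a few lines of code. This is a minimal illustration, not the procedure used in the study: the paper records are hypothetical, and the equal-split rule is just one common fractional-counting convention.

```python
# Illustrative sketch: whole vs. fractional citation counting for
# multiauthored papers. Whole counting gives every coauthor full credit;
# fractional counting splits each paper's citations equally among authors
# (an assumed convention; other weighting schemes exist).

from collections import defaultdict

papers = [  # hypothetical records
    {"authors": ["Smith, J", "Price, D"], "citations": 30},
    {"authors": ["Price, D"], "citations": 10},
    {"authors": ["Smith, J", "Jones, A", "Price, D"], "citations": 9},
]

whole = defaultdict(float)
fractional = defaultdict(float)
for p in papers:
    share = p["citations"] / len(p["authors"])
    for author in p["authors"]:
        whole[author] += p["citations"]   # full credit to each coauthor
        fractional[author] += share       # equal split among coauthors

print(whole["Price, D"])       # full-credit count
print(fractional["Price, D"])  # fractional count
```

Note that fractional counts sum to the total number of citations across all papers, while whole counts inflate with the number of coauthors; which convention a citation analysis adopts can change rankings considerably.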
An evaluation exercise was performed on 313 papers by the research staff (66 persons) of the Deutsche Rheuma-Forschungszentrum (DRFZ) published in 2004–2008. The records and the citations to them were retrieved from the Web of Science (Thomson Reuters) in March 2010. The authors compared the productivity and citedness of “group leaders” vs. “regular scientists” and of “male scientists” vs. “female scientists” using citation-based indexes. It was found that “group leaders” are more prolific and cited more often than “regular scientists”; the same holds for “male” vs. “female scientists”. The greatest contrast is observed between “female leaders” and “female regular scientists”. The above-mentioned differences are significant for indexes related to the number of papers, while the values of indexes characterizing the quality of papers (average citation rate per paper and similar indexes) do not differ substantially among the groups compared. The mean value of the percentile rank index for all 313 papers is 58.5, which is significantly higher than the global mean value of about 50. This is evidence of a higher-than-average citation status of the publications from the DRFZ.
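The percentile rank index used in this comparison can be sketched as follows. The reference distribution, the tie-handling rule, and all numbers here are assumptions for illustration only, not the exact procedure or data of the DRFZ study; the key point is that a mean percentile above 50 signals above-average citation status relative to the reference set.

```python
# Illustrative sketch of a percentile rank index: a paper's percentile is
# the share of papers in an assumed reference set (e.g. same field and year)
# that it outperforms in citations, with ties counted at half weight.

def percentile_rank(citations, reference_set):
    """Percent of reference papers cited strictly less, ties counted half."""
    below = sum(1 for c in reference_set if c < citations)
    ties = sum(1 for c in reference_set if c == citations)
    return 100.0 * (below + 0.5 * ties) / len(reference_set)

# Hypothetical reference distribution and a small set of evaluated papers
reference = [0, 1, 2, 2, 3, 5, 8, 13, 21, 40]
evaluated = [3, 8, 21]

ranks = [percentile_rank(c, reference) for c in evaluated]
mean_rank = sum(ranks) / len(ranks)
# By construction, a random paper from the reference set scores about 50
# on average, so mean_rank > 50 indicates above-average citation status.
```

With percentile indicators, each paper is judged against its own reference distribution, which avoids the skewness problems of raw citation averages.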