Author: R. Kostoff 1
  • 1 Office of Naval Research, 800 N. Quincy St., Arlington, VA 22217 (USA)

Abstract  

The present paper addresses some of the many possible uses of citations, including bookmark, intellectual heritage, impact tracker, and self-serving purposes. The main focus is on the applicability of citation analysis as an impact or quality measure. If a paper's bibliography is viewed as consisting of a directed (research impact or quality) component related to intellectual heritage and random components related to specific self-interest topics, then for large numbers of citations from many different citing papers, the most significant intellectual heritage (research impact or quality) citations will aggregate, while the random author-specific self-serving citations will be scattered and will not accumulate. However, there are at least two limitations to this model of citation analysis for stand-alone use as a measure of research impact or quality. First, the reference to intellectual heritage could be positive or negative. Second, there could be systemic biases that affect the aggregate results; one of these, the "Pied Piper Effect", is described in detail. Finally, the results of a short citation study comparing Russian and American papers in different technical fields are presented. The questions raised in interpreting these data highlight a few of the difficulties in attempting to interpret citation results without supplementary information.

Leydesdorff (Leydesdorff, 1998) addresses the history of citations and citation analysis, and the transformation of a reference mechanism into a purportedly quantitative measure of research impact/quality. The present paper examines different facets of citations and citation analysis, and discusses the validity of citation analysis as a useful measure of research impact/quality.
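The aggregation argument in the abstract can be illustrated with a small simulation. The sketch below is not from the paper itself; it is a minimal toy model, assuming a hypothetical pool of a few shared "intellectual heritage" references and a much larger pool of author-specific, self-serving references. Over many citing papers, the shared references accumulate high counts while the random ones stay scattered.

```python
import random
from collections import Counter

random.seed(42)

# Hypothetical reference pools (names are illustrative, not from the paper):
# a few widely shared "intellectual heritage" references, and a large pool
# of author-specific, self-serving references.
heritage_refs = [f"heritage_{i}" for i in range(5)]
selfserving_pool = [f"selfserving_{i}" for i in range(1000)]

counts = Counter()
for _ in range(500):  # 500 citing papers
    # Each bibliography mixes a directed (heritage) component
    # with a random, author-specific component.
    bibliography = random.sample(heritage_refs, 3) + random.sample(selfserving_pool, 10)
    counts.update(bibliography)

# In the aggregate, the heritage references dominate the citation counts,
# while each self-serving reference receives only a handful of citations.
print(counts.most_common(5))
```

Under these assumptions each heritage reference is cited in roughly 60% of the 500 papers (about 300 times), while each self-serving reference is expected to appear only about 5 times, which is the scattering the abstract describes.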