Clustering algorithms are used prominently in co-citation analysis by analysts aiming to reveal research streams within a
field. However, clustering of widely cited articles is not robust to small variations in citation patterns. We propose an
alternative algorithm, dense network sub-grouping, which identifies dense groups of co-cited references. We demonstrate the algorithm using a data set from the field of family
business research and compare it to two alternative methods, multidimensional scaling and clustering. We also introduce a
free software tool, Sitkis, that implements the algorithm and other common bibliometric methods. The software identifies journal-,
country- and university-specific citation patterns and co-citation groups, enabling the identification of “invisible colleges.”
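The core idea behind dense network sub-grouping can be illustrated with a minimal sketch (hypothetical data and function names; this is not the Sitkis implementation): count how often each pair of references is cited together, keep only pairs co-cited above a threshold, and read off the connected components of the resulting graph as dense groups.

```python
from itertools import combinations
from collections import defaultdict

def cocitation_counts(papers):
    """Count how often each pair of references is cited together."""
    counts = defaultdict(int)
    for refs in papers:
        for a, b in combinations(sorted(set(refs)), 2):
            counts[(a, b)] += 1
    return counts

def dense_groups(papers, threshold):
    """Group references whose co-citation count meets the threshold
    (connected components of the thresholded co-citation graph)."""
    adj = defaultdict(set)
    for (a, b), n in cocitation_counts(papers).items():
        if n >= threshold:
            adj[a].add(b)
            adj[b].add(a)
    seen, groups = set(), []
    for node in adj:
        if node in seen:
            continue
        stack, comp = [node], set()
        while stack:
            v = stack.pop()
            if v in comp:
                continue
            comp.add(v)
            stack.extend(adj[v] - comp)
        seen |= comp
        groups.append(comp)
    return groups

# Hypothetical citing papers, each given as a list of cited references
papers = [
    ["R1", "R2", "R3"],
    ["R1", "R2"],
    ["R1", "R2", "R4"],
    ["R5", "R6"],
    ["R5", "R6"],
]
print(dense_groups(papers, threshold=2))
```

With a threshold of 2, only the pairs (R1, R2) and (R5, R6) survive, so two dense groups emerge; weakly co-cited references such as R3 and R4 drop out, which reflects the robustness argument made above.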
Citation analysis for evaluative purposes requires reference standards, as publication activity and citation habits differ
considerably among fields. Reference standards based on journal classification schemes are fraught with problems in the case
of multidisciplinary and general journals and are limited with respect to their resolution of fields. To overcome these shortcomings
of journal classification schemes, we propose a new reference standard for chemistry and related fields that is based on the
sections of the Chemical Abstracts database. We determined the values of the reference standard for research articles published in 2000 in the biochemistry
sections of Chemical Abstracts as an example. The results show that citation habits vary extensively not only between fields but also within fields. Overall, the sections of Chemical Abstracts seem to be a promising basis for reference standards in chemistry and related fields for four reasons: (1) the wider coverage
of the pertinent literature, (2) the quality of indexing, (3) the assignment of papers published in multidisciplinary and
general journals to their respective fields, and (4) the resolution of fields on a lower level (e.g. mammalian biochemistry)
than in journal classification schemes (e.g. biochemistry & molecular biology).
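How such a reference standard would be applied can be sketched as follows (illustrative numbers and names, not values from the study): compute the mean citation rate of articles in each Chemical Abstracts section, then divide an article's observed citations by its section's expected value.

```python
from statistics import mean

# Hypothetical citation counts of year-2000 articles, grouped by
# Chemical Abstracts section (illustrative numbers only)
sections = {
    "Mammalian Biochemistry": [12, 30, 7, 25, 16],
    "Plant Biochemistry":     [4, 9, 2, 6, 4],
}

# Reference standard: expected (mean) citation rate per section
standards = {s: mean(c) for s, c in sections.items()}

def relative_citation_rate(citations, section):
    """Observed citations divided by the section's reference standard."""
    return citations / standards[section]

print(relative_citation_rate(36, "Mammalian Biochemistry"))  # 36 / 18.0 = 2.0
```

Because the standard is computed per section rather than per journal, an article in a multidisciplinary journal is still compared against its own field, which is the shortcoming of journal classification schemes the abstract addresses.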
Authors: Ruimin Ma, Qiangbin Dai, Chaoqun Ni, and Xuelu Li
Author co-citation analysis (ACA) is an important method for discovering the intellectual structure of a given scientific
field. Since traditional ACA was confined to ISI Web of Knowledge (WoK), the co-citation counts of pairs of authors mainly
depended on the data indexed in WoK. Fortunately, Google Scholar has integrated different academic databases from different
publishers, providing an opportunity to conduct ACA over a wider range. In this paper, we conduct ACA of information science
in China with the Chinese Google Scholar. First, we briefly introduce Chinese Google Scholar, including its retrieval
principles and data formats. Second, we describe the methods used in our study. Third, the 31 most important authors of information
science in China are selected as research objects. In the empirical study, factor analysis is used to identify the main
research directions of information science in China. Pajek, a powerful tool in social network analysis, is employed to visualize
the author co-citation matrix as well. Finally, the resemblances and the differences between China and other countries in
information science are pointed out.
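A standard preprocessing step in ACA, before factor analysis or visualization in a tool like Pajek, is to correlate authors' co-citation profiles: authors whose rows of the co-citation matrix are highly correlated tend to load on the same factor. A minimal sketch with hypothetical data (author names and counts are invented for illustration):

```python
from math import sqrt

# Hypothetical author co-citation matrix (symmetric counts);
# rows and columns follow the author list below
authors = ["A", "B", "C", "D"]
cocite = [
    [0, 8, 7, 1],
    [8, 0, 6, 0],
    [7, 6, 0, 2],
    [1, 0, 2, 0],
]

def pearson(x, y):
    """Pearson correlation between two co-citation profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))
    return num / den

# Highly correlated profiles suggest membership in the same
# research direction of the field
for i, j in [(0, 1), (0, 3)]:
    r = pearson(cocite[i], cocite[j])
    print(f"r({authors[i]}, {authors[j]}) = {r:.2f}")
```

The resulting correlation matrix, rather than the raw counts, is what would typically be fed into factor analysis to recover the main research directions.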
Authors: Juan Gorraiz, Christian Gumpenberger, and Martin Wieland
criteria focusing on the “best” (higher impact) literature, due to its interdisciplinary character and its comprehensive coverage (“Century of Science”).
The citation analysis was conducted using the “Cited reference search” feature in WoS.
Bibliometrics does not allow prediction of the duration of research fronts. Utilizing an analogy with the concept of adaptive radiation, this heuristic article suggests a technique which may permit a measure of predictability to bibliometrics.
We investigated committee peer review for awarding long-term fellowships to post-doctoral researchers as practiced by the
Boehringer Ingelheim Fonds (B.I.F.), a foundation for the promotion of basic research in biomedicine. Assessing the validity
of selection decisions requires a generally accepted criterion for research impact. A widely used approach is to use citation
counts as a proxy for the impact of scientific research. Therefore, a citation analysis was conducted for articles published prior to
the applicants' approval or rejection for a B.I.F. fellowship was conducted. Based on our model estimation (negative binomial
regression model), journal articles that had been published by applicants approved for a fellowship award (n = 64) prior to applying for the B.I.F. fellowship award can be expected to have 37% (straight counts of citations) and 49%
(complete counts of citations) more citations than articles that had been published by rejected applicants (n = 333). Furthermore, comparison with international scientific reference values revealed (a) that articles published by successful
and non-successful applicants are cited considerably more often than the “average” publication and (b) that excellent research
performance can be expected more often from successful than from non-successful applicants. The findings confirm that the foundation is
not only achieving its goal of selecting the best junior scientists for fellowship awards, but also successfully attracting
highly talented young scientists to apply for B.I.F. fellowships.
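The reported effect sizes follow directly from how coefficients are read in a log-link negative binomial model: a coefficient beta on a binary predictor such as "approved" multiplies expected citations by exp(beta). The sketch below uses illustrative values chosen to reproduce the reported percentages, not the study's actual estimates:

```python
from math import exp, log

# Illustrative coefficients implying the reported effect sizes
# (not the study's actual estimates)
beta_straight = log(1.37)   # straight counts of citations
beta_complete = log(1.49)   # complete counts of citations

def pct_more(beta):
    """Percentage increase in expected citations implied by beta
    under a log-link count model."""
    return (exp(beta) - 1) * 100

print(f"{pct_more(beta_straight):.0f}% more citations (straight counts)")
print(f"{pct_more(beta_complete):.0f}% more citations (complete counts)")
```

The exp(beta) - 1 transformation is what turns a regression coefficient into the "37% more citations" phrasing used in the abstract.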
The application of methods of quantitative analysis makes it possible to evaluate the impact of scientific journals on one another. These methods are used to determine the significance of similar scientific journals by their cross-citations, taking into account data from the Journal Citation Reports (JCR). They also help to improve the Journal Citation Reports structure and widen its uses for the evaluation of scientific journals. The above methods are applied to analyse critically the principles of ranking journals in package 1 and the tabular contents of JCR's packages 2 and 3, as well as to study frequency distributions of the journals both in time and space.
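A minimal sketch of a cross-citation significance measure of the kind described above (journal names and counts are hypothetical, and this particular score is an illustration, not the paper's method): tabulate citations exchanged between journals and score each journal by the share of all between-journal citations it receives.

```python
# Hypothetical cross-citation matrix: cites[i][j] is the number of
# citations journal i gives to journal j (illustrative numbers only)
journals = ["J1", "J2", "J3"]
cites = [
    [0, 40, 10],
    [30, 0, 20],
    [5, 25, 0],
]

# Share of all between-journal citations each journal receives
total = sum(sum(row) for row in cites)
received = [sum(cites[i][j] for i in range(len(journals)))
            for j in range(len(journals))]
scores = {j: r / total for j, r in zip(journals, received)}
print(scores)
```

Normalizing by the total, rather than ranking raw counts, is what allows journals of different sizes to be compared within the same table.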
Authors: Ludo Waltman, Nees Jan van Eck, Thed N. van Leeuwen, Martijn S. Visser, and Anthony F. J. van Raan
make a more general remark on the comparison of citation analysis and peer review.
The analysis of Van Raan (2006) is based on an assessment study of Dutch chemistry and chemical engineering research groups conducted