With reference to social constructivist approaches to citing behavior in the sciences, the hypothesis of an acceleration of
citing behavior after the millennium was empirically tested for a stratified random sample of exemplary psychology journal
articles. The sample consists of 45 English and 45 German articles published in 1985, 1995, and 2005 in high-impact
journals on developmental psychology, psychological diagnosis and assessment, and social psychology. Content analyses of the
reference lists refer to the total number of references cited in the articles and the publication years of all references.
In addition, the number of self-references, the number of pages, and the number of authors were determined for each article.
Results show no acceleration of citing behavior; on the contrary, a significant trend toward citing somewhat older references
emerges in the newer journal articles. Significant main effects also point to more citations of somewhat older references in
the English (vs. German) journal articles as well as in articles on social psychology and psychological diagnosis (vs.
developmental psychology). Complementary analyses show that multiple authorships
and the number of pages as well as the total number of references and the number of self-references increase significantly
with time. However, the percentage of self-references remains quite stable at about 10%. Some methodological and statistical
traps in bibliometric testing of the starting hypothesis are considered. Thus, the talk that has been circulating among
psychology colleagues and students about potential millennium effects on citing behavior in the sciences (which can, however,
become a self-fulfilling prophecy) is not confirmed, at least for psychology journals.
In reference to an exemplary bibliometric publication and citation analysis for a University Department of Psychology, some
general conceptual and methodological considerations on the evaluation of university departments and their scientists are
presented. Data refer to publication and citation-by-others analyses (PsycINFO, PSYNDEX, SSCI, and SCI) for 36 professorial
and non-professorial scientists from the tenure staff of the department under study, as well as confidential interviews on
self- and colleague-perceptions with seven members of the sample under study. The results point to (1) skewed (Pareto)
distributions of all bibliometric variables, demanding nonparametric statistical analyses, (2) three outliers (the same three
persons across variables) which must be excluded from some statistical analyses, (3) rather low rank-order correlations of
publication and citation frequencies, sharing approximately 15% common variance, (4) only weak interdependences of bibliometric
variables with age, occupational experience, gender, academic status, and engagement in basic versus applied research, (5) the
empirical appropriateness and utility of a normative typological model for the evaluation of scientists' research productivity
and impact, based on cross-classifications of the number of publications and the frequency of citations by other authors, and
(6) low interrater reliabilities and validity of ad hoc evaluations within the department's staff. Conclusions refer to the utility of bibliometric data for external peer reviewing
and for feedback within scientific departments, in order to make colleague-perceptions more reliable and valid.
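The roughly 15% common variance reported in point (3) follows directly from squaring the correlation coefficient; a minimal sketch (the correlation value here is illustrative, not taken from the study):

```python
# Common (shared) variance between two measures is the square of their
# correlation coefficient. A rank-order correlation of about .39 thus
# implies roughly 15% shared variance between publication and citation
# frequencies.
rho = 0.39                       # illustrative value, not from the study
common_variance = rho ** 2       # coefficient of determination
print(f"{common_variance:.0%}")  # prints "15%"
```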
Authors: Günter Krampen, Alexander von Eye and Gabriel Schui
Bibliometric data on psychology publications from 1977 through 2008 are modeled and forecasted for the 10 years following 2008. Data refer to the raw frequencies of PsycINFO (94% English-language, mainly Anglo-American publications) and the English-language documents of PSYNDEX (publications from the German-speaking countries). The series were modeled by way of exponential smoothing. In contrast to single moving average methods, which weight all observations in the window equally, exponential smoothing assigns differential weights to observations; the weights decay with distance from the most recent data point. Results suggest strongly expanding publication activities which can be represented by exponential functions. In addition, forecasted publication activities, estimated from past psychology publication frequencies, show positive bibliometric trends in the Anglo-American research community. These trends run parallel to the bibliometric trends for the English-language publications of German-speaking authors. However, while positive trends were forecasted for all psychological subdisciplines of the Anglo-American publication database PsycINFO, negative bibliometric trends were estimated for English-language publications from German-speaking authors in 6 out of 20 subdisciplines.
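The contrast between a moving average and exponential smoothing can be sketched as follows; the publication counts and the smoothing constant are illustrative assumptions, not data from the study:

```python
# Simple exponential smoothing: each smoothed value is a weighted average
# in which weights decay geometrically with distance from the most recent
# observation (unlike a moving average, which weights all observations in
# its window equally).

def exponential_smoothing(series, alpha):
    """Return the smoothed series; 0 < alpha <= 1 controls the decay."""
    smoothed = [series[0]]  # initialize the level with the first observation
    for value in series[1:]:
        smoothed.append(alpha * value + (1 - alpha) * smoothed[-1])
    return smoothed

def forecast(series, alpha, horizon):
    """Flat h-step-ahead forecast from the last smoothed level."""
    level = exponential_smoothing(series, alpha)[-1]
    return [level] * horizon

# Illustrative (hypothetical) annual publication counts
counts = [100, 110, 125, 150, 180, 220]
print(forecast(counts, alpha=0.5, horizon=3))
# -> [188.125, 188.125, 188.125]
```

With alpha = 1 the smoothed series reproduces the observations exactly; smaller alpha gives older observations more influence on the forecast. (Trend-capturing variants such as Holt's double exponential smoothing would be needed to extrapolate the growth itself.)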
Authors: Günter Krampen, Ralf Becker, Ute Wahner and Leo Montada
In reference to the increasing significance of citation counting in evaluations of scientists and science institutes as well
as in science historiography, it is analyzed empirically what is cited, how frequently, and which types of citations are used
in scientific texts. Content analyses refer to the numbers of references and self-references, the publication languages of
references cited, the publication types of references cited, and the types of citation within the texts. The validity of citation counting is empirically
analyzed with reference to random samples of English and German journal articles as well as German textbooks, encyclopedias,
and test manuals from psychology. Results show that 25% of all citations are perfunctory; more than 50% of references are
journal articles, up to 40% are books and book chapters, and about 10% are self-references. Differences between publications
from various psychological subdisciplines, publication languages, and types of publication are weak. Thus, the validity of
evaluative citation counting is limited, both because at least one quarter of all citations are perfunctory and thus of very
low informational utility, and because existing citation databases cover journal articles only.