Search Results

Showing 1 - 2 of 2 items for Author or Editor: W. Burger (all content).

Abstract  

A comparison is made between two types of past performance analysis of research: the results of bibliometric indicators and the results of peer judgement. This paper focuses on two case studies, the work of the Dutch National Survey Committees on Chemistry and on Biology, both compared with our bibliometric results for research groups in these disciplines at the University of Leiden. The comparison reveals a serious lack of agreement between the two types of past performance analysis. This observation, which is important for science policy, is discussed in the paper.


Abstract  

An analysis of three major problems in the application of bibliometric research performance indicators is made in three separate sections. In the first section, the influence of field-dependent citation practices is analysed. The results indicate that rankings of publications from different fields, based on citation counts, can be seriously affected by differences in citation characteristics between those fields. If certain assumptions hold, one should expect high (short-term) citation levels in Biochemistry, Cell Biology and Biophysics; medium citation levels in Experimental and Molecular Physics, Physical and Organic Chemistry, Pharmacology and Plant Physiology; and low citation levels in Mathematics, Taxonomy, Pharmacognosy and Inorganic Solid State Chemistry. In the second section, time-dependent factors are studied. It is shown that trend analyses of output and impact based on bibliometric scores can be disturbed by changes in the SCI database and in publication and citation practices. One of the disturbing factors is shown to be the inclusion of so-called "Books" into the SCI database in 1977. Finally, in the third section, a case is presented which illustrates the consequences of operating on incomplete bibliometric data in the evaluation of scientific performance. A completeness percentage of 99% for publication data is proposed as a standard in evaluations of the performance of small university research groups.
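The field-dependence problem described in the first section motivates field-normalized indicators: dividing a publication's citation count by the mean citation rate of its field before comparing across fields. A minimal sketch of that idea (the field names follow the abstract, but all numbers and the function name are invented for illustration, not data from the paper):

```python
# Hypothetical mean short-term citation rates per field, reflecting the
# high / medium / low levels the abstract describes (numbers invented).
field_mean_citations = {
    "Biochemistry": 20.0,        # high citation level
    "Organic Chemistry": 8.0,    # medium citation level
    "Mathematics": 2.0,          # low citation level
}

def normalized_impact(citations, field):
    """Citation count relative to the field's mean citation rate."""
    return citations / field_mean_citations[field]

# Relative to its own field, a mathematics paper with 4 citations
# outperforms a biochemistry paper with 15:
print(normalized_impact(4, "Mathematics"))    # 2.0
print(normalized_impact(15, "Biochemistry"))  # 0.75
```

On raw counts the biochemistry paper ranks far higher; after normalization the ordering reverses, which is exactly why rankings that mix fields without such a correction can be seriously misleading.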
