Search Results

You are looking at 1 - 4 of 4 items for

  • Author or Editor: Martijn S. Visser

Summary  

This paper reports the first results of the extension of citation analysis to 'non-source' items, one strand of an extensive study of quantitative performance indicators used in the assessment of research. It would be presumptuous to draw firm conclusions from this first foray into the realm of non-source citations; however, our analysis is based on an extensive experimental database of over 30,000 publications, so the results can be viewed as strong pointers to possible generalised outcomes. We show that it is possible to mine ISI databases for references to a comprehensive oeuvre of items from whole institutions. Many types of publications are visible in the ISI data: books, book chapters, journals not indexed by ISI, and some conference publications. When applied to the assessment of university departments, non-source citations can have a significant effect on rankings, though not in all cases. The investment of time, effort, and money in a significantly extended analysis will not be equally beneficial in all fields, and a considerable amount of testing is required to confirm our initial results.
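As a purely illustrative sketch of the mining step described above (the data model and matching key are assumptions, not the study's actual procedure), matching cited references in ISI data against an institution's full oeuvre might look like this:

    # Hypothetical sketch: count citations to 'non-source' items (books,
    # chapters, non-ISI journals) by matching cited references against an
    # institution's publication list. Field names are assumptions.

    def match_key(first_author, year, source):
        # Crude match key: normalised first author, year, and source title.
        return (first_author.strip().lower(), int(year), source.strip().lower())

    def count_nonsource_citations(cited_references, oeuvre):
        index = {match_key(p["first_author"], p["year"], p["source"]) for p in oeuvre}
        counts = {}
        for ref in cited_references:
            key = match_key(ref["first_author"], ref["year"], ref["source"])
            if key in index:
                counts[key] = counts.get(key, 0) + 1
        return counts

In practice reference strings are noisy, so real matching would need fuzzier rules than this exact-key lookup.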

Restricted access

Abstract

We applied a set of standard bibliometric indicators to monitor the scientific performance of 500 universities worldwide and constructed a ranking on the basis of these indicators (Leiden Ranking 2010). We find a dramatic and hitherto largely underestimated language effect in bibliometric, citation-based measurements of research performance when comparing the ranking based on all Web of Science (WoS) covered publications with the ranking based on English-language WoS covered publications only, particularly for Germany and France.
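A minimal sketch of the comparison behind this finding (the data model is hypothetical): restricting the same citation-based ranking to English-language publications can flip the order when a university's non-English output is cited less than its English output.

    # Hypothetical illustration: rank universities by mean citations per
    # publication, over all publications vs. English-only publications.
    from statistics import mean

    def ranking(universities, english_only=False):
        scores = {}
        for uni, pubs in universities.items():
            sel = [p for p in pubs if p["lang"] == "en"] if english_only else pubs
            scores[uni] = mean(p["citations"] for p in sel) if sel else 0.0
        return sorted(scores, key=scores.get, reverse=True)

    # U1's German-language paper is cited little and dilutes its average:
    unis = {
        "U1": [{"lang": "en", "citations": 10}, {"lang": "de", "citations": 2}],
        "U2": [{"lang": "en", "citations": 8}, {"lang": "en", "citations": 9}],
    }
    print(ranking(unis))                     # ['U2', 'U1']
    print(ranking(unis, english_only=True))  # ['U1', 'U2']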

Open access
Scientometrics
Authors: Ludo Waltman, Nees Jan van Eck, Thed N. van Leeuwen, Martijn S. Visser, and Anthony F. J. van Raan

Abstract

We present an empirical comparison between two normalization mechanisms for citation-based indicators of research performance. These mechanisms aim to normalize citation counts for the field and the year in which a publication was published. One mechanism is applied in the current so-called crown indicator of our institute. The other mechanism is applied in the new crown indicator that our institute is currently exploring. We find that at high aggregation levels, such as at the level of large research institutions or at the level of countries, the differences between the two mechanisms are very small. At lower aggregation levels, such as at the level of research groups or at the level of journals, the differences between the two mechanisms are somewhat larger. We pay special attention to the way in which recent publications are handled. These publications typically have very low citation counts and should therefore be handled with special care.
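The two mechanisms can be summarised, in hedged form, as a ratio of averages versus an average of ratios (the variable names below are assumptions; the new mechanism is the one known in the literature as MNCS):

    # Sketch: c[i] = actual citations of publication i, e[i] = expected
    # citations for its field and year. The old crown indicator divides
    # the totals; the new one averages the per-publication ratios.

    def cpp_fcsm(c, e):
        # Old crown indicator: ratio of averages, sum(c_i) / sum(e_i).
        return sum(c) / sum(e)

    def new_crown(c, e):
        # New crown indicator (MNCS): average of ratios, mean of c_i / e_i.
        return sum(ci / ei for ci, ei in zip(c, e)) / len(c)

    # A recent publication has a tiny expected rate e_i, so its single
    # ratio can dominate the average of ratios; this is one reason such
    # publications need special care.
    c = [2, 12, 3]
    e = [0.1, 4.0, 2.5]
    print(cpp_fcsm(c, e))   # ~2.58
    print(new_crown(c, e))  # ~8.07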

Open access
Scientometrics
Authors: Ludo Waltman, Nees Jan van Eck, Thed N. van Leeuwen, Martijn S. Visser, and Anthony F. J. van Raan

Abstract

Opthof and Leydesdorff (Scientometrics, 2011) reanalyze data reported by Van Raan (Scientometrics 67(3):491–502, 2006) and conclude that there is no significant correlation between average citation scores measured using the CPP/FCSm indicator on the one hand and the quality judgment of peers on the other. We point out that Opthof and Leydesdorff draw their conclusions from a very limited amount of data, and we criticize the statistical methodology they use. Using a larger amount of data and a more appropriate statistical methodology, we do find a significant correlation between the CPP/FCSm indicator and peer judgment.
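Purely as a generic illustration (not the statistical methodology used in the paper), a rank correlation between group-level indicator scores and ordinal peer ratings can be computed as follows:

    # Hypothetical data; scipy's spearmanr returns (rho, p-value).
    from scipy.stats import spearmanr

    cpp_fcsm_scores = [0.8, 1.1, 1.9, 2.4, 3.0]  # indicator per research group
    peer_ratings = [2, 3, 4, 4, 5]               # peer quality judgments

    rho, p = spearmanr(cpp_fcsm_scores, peer_ratings)
    print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")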

Open access