  • 1 National Research Council of Italy, Rome, Italy
  • 2 Department of Management, Laboratory for Studies of Research and Technology Transfer, School of Engineering, University of Rome “Tor Vergata”, Via del Politecnico 1, 00133, Rome, Italy, Dangelo@disp.uniroma2.it

Abstract

National research assessment exercises are becoming regular events in a growing number of countries. The present work contrasts the peer-review and bibliometric approaches to conducting these exercises. The comparison is carried out in terms of the essential parameters of any measurement system: accuracy, robustness, validity, functionality, time, and cost. Empirical evidence shows that, for the natural and formal sciences, the bibliometric methodology is by far preferable to peer review. Setting up national databases of publications by individual authors, derived from the Web of Science or Scopus databases, would allow much better, cheaper, and more frequent national research assessments.
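The core technical step behind such national databases is attributing each publication's author strings to individual researchers on a national roster. A minimal sketch of that linkage is shown below, assuming two hypothetical CSV exports: a `roster.csv` with `researcher_id`, `first_name`, and `last_name` columns, and a `publications.csv` with `publication_id` and a semicolon-separated `authors` field, as a Web of Science or Scopus export might provide. Real-world records require the heuristic disambiguation the abstract alludes to, since last-name-plus-initial keys collide for homonyms; this sketch only flags those collisions.

```python
import csv
from collections import defaultdict


def name_key(last, first):
    """Normalize a name to 'lastname f' (last name plus first initial)."""
    return f"{last.strip().lower()} {first.strip().lower()[:1]}"


def build_author_index(roster_rows):
    """Map normalized name keys to the roster researchers sharing them.

    Keys claimed by more than one researcher are homonyms and need
    further (heuristic or manual) disambiguation before any publication
    is credited to an individual.
    """
    index = defaultdict(list)
    for row in roster_rows:
        key = name_key(row["last_name"], row["first_name"])
        index[key].append(row["researcher_id"])
    return index


def link_publications(pub_rows, index):
    """Yield (publication_id, candidate researcher_ids) per matched author."""
    for pub in pub_rows:
        # Author strings like "Abramo G; D'Angelo CA" (assumed format).
        for author in pub["authors"].split(";"):
            parts = author.strip().rsplit(" ", 1)
            if len(parts) != 2:
                continue  # skip strings that do not parse as "Last Initials"
            last, initials = parts
            key = name_key(last, initials)
            if key in index:
                yield pub["publication_id"], index[key]


if __name__ == "__main__":
    with open("roster.csv", newline="", encoding="utf-8") as f:
        index = build_author_index(csv.DictReader(f))
    with open("publications.csv", newline="", encoding="utf-8") as f:
        for pub_id, researchers in link_publications(csv.DictReader(f), index):
            flag = "AMBIGUOUS" if len(researchers) > 1 else ""
            print(pub_id, researchers, flag)
```

The design choice of keying on last name plus first initial mirrors how bibliographic databases abbreviate author names; it maximizes recall at the cost of precision, which is why the homonym flag, rather than automatic attribution, is the honest output of this first pass.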
