  • 1 National Research Council of Italy, Rome, Italy
  • 2 Laboratory for Studies of Research and Technology Transfer, School of Engineering, Department of Management, University of Rome “Tor Vergata”, Via del Politecnico 1, 00133, Rome, Italy

Abstract

National research evaluation exercises provide a comparative measure of the research performance of a nation's institutions, and as such represent a tool for stimulating research productivity, particularly if the results are used to inform selective funding by government. While one school of thought welcomes frequent changes in evaluation criteria in order to prevent the subjects evaluated from adopting opportunistic behaviors, it is evident that the "rules of the game" should above all serve policy objectives, and should therefore be known with adequate forewarning prior to the evaluation period. Otherwise, the risk is that policy-makers will find themselves facing a dilemma: should they reward universities that responded best to the criteria in effect at the outset of the observation period, or those that rank best under rules that emerged during or after it? This study verifies whether, and to what extent, some universities are penalized rather than rewarded for good behavior in pursuit of the objectives of the "known" rules of the game, by comparing the research performance of Italian universities over the period of the nation's next evaluation exercise (2004–2008): first as measured according to the criteria available at the outset of the period, and then according to those announced at its end.

