Search Results
In this paper we present characteristics of the statistical correlation of the Hirsch (h-) index with several standard bibliometric indicators and with the results of peer review judgment. We use the results of a large evaluation study of 147 university chemistry research groups in the Netherlands, covering the work of about 700 senior researchers during the period 1991-2000. Thus, we deal with research groups rather than individual scientists, as we consider the research group to be the most important work-floor unit of research, particularly in the natural sciences. Furthermore, we restrict the citation period to a three-year window instead of 'lifetime counts' in order to focus on the impact of recent work and thus on current research performance. Results show that the h-index and our bibliometric 'crown indicator' both relate to peer judgments in a quite comparable way, but for smaller groups in fields with 'less heavy citation traffic' the crown indicator appears to be a more appropriate measure of research performance.
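As a minimal illustration of the index at the heart of this comparison, the Python sketch below computes the h-index of a research group from a list of per-paper citation counts; the numbers and the assumption that each paper's citations are already restricted to a three-year window are illustrative only and do not reproduce the study's bibliometric system.

def h_index(citation_counts):
    # Largest h such that at least h papers have at least h citations each.
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical per-paper citation counts for one group, each counted
# within a three-year window after publication (illustrative numbers).
group_citations = [24, 18, 12, 9, 7, 5, 3, 2, 1, 0]
print(h_index(group_citations))  # prints 5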
J. Inform. 2007, 1, 26–34. Schubert, A.: Hirsch-index folyóiratokra [Hirsch index for journals]. 2006, http
Tudománymetriai újdonságok. Vége az impakt faktor egyeduralmának?
New bibliometric indicators. Is this the end of the impact factor era?
16. Schreiber, M.: Self-citation corrections for the Hirsch index. Accepted for publication in Europhys. Lett., arXiv:physics/0701231v2 [physics.soc-ph], 16 Mar
… situations there are other, more useful methods, e.g. total number of publications, complete citation counts, or the Hirsch index (Hirsch 2005). Detailed analysis confirms that in history, for frequently cited authors, all measured indicators are closely …
Abstract
This article introduces the generalized Kosmulski-indices as a new family of scientific impact measures for ranking the output of scientific researchers. As special cases, this family contains the well-known Hirsch-index h and the Kosmulski-index h(2). The main contribution is an axiomatic characterization of every generalized Kosmulski-index in terms of three axioms.
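A short sketch may help fix the definitions; it assumes the usual reading under which the generalized Kosmulski-index of order k is the largest number h such that h papers each received at least h**k citations, so k = 1 recovers the Hirsch-index h and k = 2 the Kosmulski-index h(2). The function name and the sample data are illustrative.

def generalized_kosmulski(citation_counts, k):
    # Largest h such that h papers each have at least h**k citations.
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank ** k:
            h = rank
        else:
            break
    return h

cites = [120, 80, 40, 25, 16, 9, 4, 2, 1]
print(generalized_kosmulski(cites, 1))  # Hirsch-index h
print(generalized_kosmulski(cites, 2))  # Kosmulski-index h(2)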
Tudománymetria: Mit Mérünk, Mit Hasonlíthatunk Össze És Mivel? •
A régészek képe a tudományos referencia adatbázis tükrében
Scientometrics: What do we measure, what do we compare, and with what? •
The image of archaeologists in the light of the scientific reference database
The reference database “Scientometrics of Hungarian Researchers”, made public in spring 2020, ranks researchers on the basis of three parameters: the annual average of citations, the Hirsch index, and the number of publications in Q1 journals; in the aggregate ranking, the third parameter counts with double weight. Presented here are the results of a test using the public data of archaeologists and researchers of the Ancient Near East, a review of the specific problems faced by humanities researchers, and a call for a discussion to find a more appropriate set of parameters that would better fit the specifics of the humanities.
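To make the double weighting concrete, the sketch below aggregates the three parameters into one score; the min-max normalization and the additive rule are assumptions chosen purely for illustration, since the database's actual aggregation formula is not given here.

def composite_score(avg_citations, h_index, q1_papers, cohort):
    # Hypothetical aggregate score: each parameter is min-max normalized
    # against a cohort and the Q1-publication parameter counts twice.
    def norm(value, values):
        lo, hi = min(values), max(values)
        return 0.0 if hi == lo else (value - lo) / (hi - lo)

    a = norm(avg_citations, [row[0] for row in cohort])
    h = norm(h_index, [row[1] for row in cohort])
    q = norm(q1_papers, [row[2] for row in cohort])
    return a + h + 2 * q  # double weight on Q1 publications

# cohort rows: (annual average of citations, Hirsch index, Q1 papers) -- made up
cohort = [(3.2, 5, 2), (10.5, 12, 8), (1.1, 3, 0)]
print(composite_score(3.2, 5, 2, cohort))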
Abstract
This study applies Prathap’s approach to successive h-indices in order to measure the influence of the research staff on institutional impact. The twelve most productive Cuban institutions engaged in the study of the human brain are analysed. The Hirsch index is used to measure the impact of the institutional scientific output, with the g-index and R-index as complementary indicators. Prathap’s approach to successive h-indices, based on the author-institution hierarchy, is used to determine institutional impact through the performance of the research staff. The combination of different Hirsch-type indices for institutional evaluation is illustrated.
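The Hirsch-type indices combined here can be sketched as follows, assuming the usual definitions: the g-index is the largest g such that the g most cited papers together have at least g**2 citations, the R-index is the square root of the citations received by the h-core, and the institutional successive h-index is the h-index computed over the researchers' individual h-indices. The data and the flat author-institution hierarchy below are illustrative, not the study's.

import math

def h_index(cites):
    cites = sorted(cites, reverse=True)
    return max([rank for rank, c in enumerate(cites, 1) if c >= rank], default=0)

def g_index(cites):
    # largest g such that the g most cited papers together have >= g**2 citations
    cites = sorted(cites, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(cites, 1):
        total += c
        if total >= rank * rank:
            g = rank
    return g

def r_index(cites):
    # square root of the citations received by the h-core papers
    cites = sorted(cites, reverse=True)
    return math.sqrt(sum(cites[:h_index(cites)]))

def successive_h(individual_h_indices):
    # institutional h-index taken over the researchers' own h-indices
    return h_index(individual_h_indices)

# hypothetical institution: one list of per-paper citation counts per researcher
institution = [[30, 12, 8, 3], [15, 9, 4, 4, 1], [6, 5, 2], [40, 22, 10, 7, 5, 1]]
researcher_hs = [h_index(papers) for papers in institution]
print(researcher_hs, successive_h(researcher_hs))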
Abstract
We have developed a method to obtain robust quantitative bibliometric indicators for several thousand scientists. This allows us to study the dependence of bibliometric indicators (such as the number of publications, the number of citations, or the Hirsch index) on the age, position, etc. of CNRS scientists. Our data suggest that the normalized h-index (h divided by the career length) is not constant for scientists with the same productivity but different ages. We also compare the predictions of several bibliometric indicators on the promotions of about 600 CNRS researchers. Contrary to previous publications, our study encompasses most disciplines, and shows that no single indicator is the best predictor for all disciplines. Overall, however, the Hirsch index h provides the least bad correlations, followed by the number of papers published. It is important to realize, however, that even h recovers only half of the actual promotions. The number of citations and the mean number of citations per paper are definitely not good predictors of promotion. Due to space constraints, this paper is a short version of a more detailed article [Jensen et al., 2008b].
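A minimal sketch of the age normalization mentioned above, assuming that career length is counted in years since the first publication (the authors' exact convention is not stated in this summary):

def normalized_h(h, first_publication_year, current_year):
    # h-index divided by career length in years
    career_length = max(current_year - first_publication_year + 1, 1)
    return h / career_length

print(normalized_h(h=18, first_publication_year=1995, current_year=2007))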
Abstract
A scheme for evaluating the impact of a given scientific paper based on the importance of the papers citing it is investigated. Introducing a weight for a given citation, dependent on the previous scientific achievements of the author of the citing paper, we define the weighting factor of a given scientist. Technically, the weighting factors are defined by the components of the normalized leading eigenvector of the matrix describing the citation graph. The weighting factor of a given scientist, reflecting the scientific output of the other researchers citing his work, allows us to define the weighted number of citations of a given paper, the weighted impact factor of a journal, and the weighted Hirsch index of an individual scientist or of an entire scientific institution.
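A sketch of the construction, under stated assumptions: the citation graph is reduced to a small author-level matrix, the leading eigenvector is obtained by plain power iteration, and the weighted citation count of a paper is taken as the sum of the weighting factors of its citing authors. The matrix entries and the citing-author indices are hypothetical.

import numpy as np

# Hypothetical author-level citation matrix: C[i, j] = number of times
# author j's papers cite author i's papers (illustrative numbers only).
C = np.array([[0, 3, 1, 0],
              [2, 0, 4, 1],
              [1, 2, 0, 5],
              [0, 1, 2, 0]], dtype=float)

def weighting_factors(C, iterations=200):
    # Power iteration for the normalized leading eigenvector of C.
    w = np.ones(C.shape[0]) / C.shape[0]
    for _ in range(iterations):
        w = C @ w
        w = w / w.sum()  # keep the eigenvector normalized to sum 1
    return w

w = weighting_factors(C)

# Weighted citation count of one paper: sum of the weighting factors of the
# (hypothetical) authors of the papers that cite it.
citing_authors = [1, 2, 2, 3]
weighted_citations = sum(w[a] for a in citing_authors)
print(w, weighted_citations)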
Abstract
The qualitative label ‘international journal’ is used widely, including in national research quality assessments. We determined the practicability of analysing internationality quantitatively using 39 conservation biology journals, providing a single numeric index (IIJ) based on 10 variables covering the countries represented in the journals’ editorial boards, authors and authors citing the journals’ papers. A numerical taxonomic analysis refined the interpretation, revealing six categories of journals reflecting distinct international emphases not apparent from simple inspection of the IIJs alone. Categories correlated significantly with journals’ citation impact (measured by the Hirsch index), with their rankings under the Australian Commonwealth’s ‘Excellence in Research for Australia’ and with some countries of publication, but not with listing by ISI Web of Science. The assessments do not reflect on quality, but may aid editors planning distinctive journal profiles, or authors seeking appropriate outlets.