Search Results

Showing items 101 - 110 of 407 for: "Citation analysis"
Scientometrics
Authors: Bárbara Lancho-Barrantes, Vicente Guerrero-Bote, and Félix Moya-Anegón

Abstract  

A study is described of the rank/JIF (Journal Impact Factor) distributions in the high-coverage Scopus database, using recent data and a three-year citation window. It includes a comparison with an older study of the Journal Citation Report categories and indicators, and a determination of the factors most influencing the distributions. While all the specific subject areas fit a negative logarithmic law fairly well, those with a greater External JIF have distributions with a more sharply defined peak and a longer tail—something like an iceberg. No S-shaped distributions, such as predicted by Egghe, were found. A strong correlation was observed between the knowledge export and import ratios. Finally, data from both Scopus and ISI were used to characterize the rank/JIF distributions by subject area.
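The fit described above can be reproduced in outline. The following sketch uses invented rank/JIF values rather than Scopus data and fits a negative logarithmic law JIF(rank) = a - b*ln(rank) to one hypothetical subject area's ranked impact factors.

```python
# Illustrative sketch (not the authors' code): fitting a negative logarithmic
# law JIF(rank) = a - b*ln(rank) to a hypothetical rank/JIF series for one
# subject area, in the spirit of the distributions described in the abstract.
import numpy as np
from scipy.optimize import curve_fit

def neg_log_law(rank, a, b):
    """Negative logarithmic decay of JIF with journal rank."""
    return a - b * np.log(rank)

# Invented data: journals ranked 1..50 by descending JIF in one category.
ranks = np.arange(1, 51)
jif = 4.2 - 0.9 * np.log(ranks) + np.random.normal(0, 0.1, ranks.size)

(a, b), _ = curve_fit(neg_log_law, ranks, jif)
residuals = jif - neg_log_law(ranks, a, b)
r_squared = 1 - residuals.var() / jif.var()
print(f"a={a:.2f}, b={b:.2f}, R^2={r_squared:.3f}")
```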

Restricted access

Abstract  

In order to measure the degree to which Google Scholar can compete with bibliographical databases, search results from this database are compared with Thomson’s ISI WoS (Institute for Scientific Information, Web of Science). For earth science literature, 85% of documents indexed by ISI WoS were recalled by Google Scholar. The rank of records displayed in Google Scholar and ISI WoS is compared by means of Spearman’s footrule. For impact measures, the h-index is investigated. Similarities in measures were significant for the two sources.
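Both measures used in this comparison are simple to compute. The sketch below uses hypothetical document identifiers and citation counts, not the study's data, to illustrate Spearman's footrule over two result rankings and the h-index over a citation list.

```python
# Minimal sketch (assumed data, not the study's): Spearman's footrule between
# two rankings of the same records, and the h-index from a citation list.
def spearman_footrule(rank_a, rank_b):
    """Sum of absolute rank differences for items ranked by both sources."""
    common = set(rank_a) & set(rank_b)
    return sum(abs(rank_a.index(doc) - rank_b.index(doc)) for doc in common)

def h_index(citations):
    """Largest h such that at least h items have >= h citations."""
    cites = sorted(citations, reverse=True)
    return sum(1 for i, c in enumerate(cites, start=1) if c >= i)

google_scholar = ["d1", "d2", "d3", "d4", "d5"]   # hypothetical result order
wos            = ["d2", "d1", "d3", "d5", "d4"]
print(spearman_footrule(google_scholar, wos))     # 4
print(h_index([12, 9, 7, 5, 3, 1]))               # 4
```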

Restricted access
Scientometrics
Authors: Mark Elkins, Christopher Maher, Robert Herbert, Anne Moseley, and Catherine Sherrington

Abstract  

To determine the degree of correlation among journal citation indices that reflect the average number of citations per article, the most recent journal ratings were downloaded from the websites publishing four journal citation indices: the Institute for Scientific Information’s journal impact factor index, Eigenfactor’s article influence index, SCImago’s journal rank index and Scopus’ trend line index. Correlations were determined for each pair of indices, using ratings from all journals that could be identified as having been rated on both indices. Correlations between the six possible pairings of the four indices were tested with Spearman’s rho. Within each of the six possible pairings, the prevalence of identifiable errors was examined in a random selection of 10 journals and among the 10 most discordantly ranked journals on the two indices. The number of journals that could be matched within each pair of indices ranged from 1,857 to 6,508. Paired ratings for all journals showed strong to very strong correlations, with Spearman’s rho values ranging from 0.61 to 0.89, all p < 0.001. Identifiable errors were more common among scores for journals that had very discordant ranks on a pair of indices. These four journal citation indices were significantly correlated, providing evidence of convergent validity (i.e. they reflect the same underlying construct of average citability per article in a journal). Discordance in the ranking of a journal on two indices was in some cases due to an error in one index.
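A minimal sketch of the matching-and-correlation step described above, using invented ratings for two of the indices rather than the downloaded journal lists:

```python
# Sketch only (illustrative ratings, not the study's data): Spearman
# correlation between two journal citation indices after matching journals.
from scipy.stats import spearmanr

# Hypothetical ratings keyed by journal title for two indices.
impact_factor     = {"J1": 3.1, "J2": 2.4, "J3": 5.6, "J4": 0.9, "J5": 1.7}
article_influence = {"J1": 1.2, "J2": 0.8, "J3": 2.9, "J5": 0.5, "J6": 1.1}

matched = sorted(impact_factor.keys() & article_influence.keys())
rho, p = spearmanr([impact_factor[j] for j in matched],
                   [article_influence[j] for j in matched])
print(f"{len(matched)} matched journals, rho={rho:.2f}, p={p:.3f}")
```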

Restricted access

Abstract  

The investigators studied author research impact using the number of citers per publication that an author’s research has attracted, as opposed to the more traditional measure of citations. A focus on citers provides a complementary measure of an author’s reach or influence in a field, whereas citations, although possibly numerous, may not reflect this reach, particularly if many citations are received from a small number of citers. In this exploratory study, Web of Science was used to tally citer-based and citation-based counts for 25 highly cited researchers in information studies in the United States and 26 highly cited researchers from the United Kingdom. Outcomes of the tallies based on several measures, including an introduced ch-index, were used to determine whether differences arise in author rankings when citer-based rather than citation-based counts are used. The findings indicate a strong correlation between some citation-based and citer-based measures, but not others, and have implications for the way authors’ research impact may be assessed.
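The exact definition of the introduced ch-index is given in the paper. Assuming here that it is an h-type index computed over distinct citers per publication rather than raw citation counts, a sketch of the comparison with invented data looks like this:

```python
# Hedged sketch: an h-type index over unique citers per publication. The
# abstract's ch-index is assumed to replace citation counts with counts of
# distinct citing authors; the paper gives the authoritative definition.
def h_type_index(values):
    """Largest h such that at least h items have a value >= h."""
    vals = sorted(values, reverse=True)
    return sum(1 for i, v in enumerate(vals, start=1) if v >= i)

# Hypothetical author: each publication maps to its set of citing authors.
citers_per_pub = [
    {"a", "b", "c", "d"},   # 4 distinct citers
    {"a", "b", "e"},        # 3
    {"b", "c"},             # 2
    {"a"},                  # 1
]
citations_per_pub = [9, 5, 4, 1]  # raw citation counts for comparison

print("ch-index:", h_type_index([len(s) for s in citers_per_pub]))  # 2
print("h-index :", h_type_index(citations_per_pub))                 # 3
```

The toy numbers show the point made in the abstract: many citations from a small pool of citers can raise the citation-based count without raising the citer-based one.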

Restricted access

Abstract  

The most popular method for judging the impact of biomedical articles is the citation count, i.e. the number of citations an article has received. Its most significant limitation is that it cannot evaluate articles at the time of publication, since citations accumulate over time. This work presents computer models that accurately predict the citation counts of biomedical publications over a deep horizon of 10 years using only predictive information available at publication time. Our experiments show that it is indeed feasible to accurately predict future citation counts with a mixture of content-based and bibliometric features using machine learning methods. The models pave the way for practical prediction of the long-term impact of publications, and their statistical analysis provides greater insight into citation behavior.
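A hedged sketch of the general approach, not the authors' models: invented toy data, TF-IDF content features, and two assumed bibliometric features feed a regressor that predicts 10-year citation counts.

```python
# Illustrative sketch (not the authors' models): predicting a future citation
# count from features available at publication time, mixing content features
# (TF-IDF of the abstract) with simple bibliometric features. Data is invented.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.ensemble import GradientBoostingRegressor

abstracts = ["deep learning for protein folding",
             "survey of citation analysis methods",
             "randomized trial of a new therapy"]
biblio = np.array([[12, 2.3],   # [author count, venue impact factor] (assumed features)
                   [3, 1.1],
                   [7, 4.8]])
citations_at_10y = np.array([150, 40, 220])   # invented targets

text_features = TfidfVectorizer().fit_transform(abstracts).toarray()
X = np.hstack([text_features, biblio])

model = GradientBoostingRegressor().fit(X, citations_at_10y)
print(model.predict(X[:1]))
```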

Restricted access

Abstract  

Hirsch’s concept of the h-index was used to define a similarity measure for journals. The h-similarity is easy to calculate from the publicly available data of the Journal Citation Reports and allows for a plausible interpretation. On the basis of h-similarity, a relative eminence indicator of journals was determined: the ratio of a journal’s JCR impact factor to the weighted average impact factor of similar journals. This standardization allows journals from disciplines with lower average citation levels (mathematics, engineering, etc.) to get into the top lists.
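A minimal sketch of the relative eminence calculation, under assumed h-similarity weights and invented impact factors; the actual weighting scheme is specified in the paper.

```python
# Minimal sketch: relative eminence as a journal's impact factor divided by
# the similarity-weighted average impact factor of its similar journals.
# Similarity weights and impact factors below are invented for illustration.
def relative_eminence(jif, own_journal, similarity):
    """similarity[j] = assumed h-similarity weight between own_journal and j."""
    weighted = sum(similarity[j] * jif[j] for j in similarity)
    total_w = sum(similarity.values())
    return jif[own_journal] / (weighted / total_w)

jif = {"MathJ": 0.8, "MathRev": 0.6, "ApplMath": 1.0, "BioJ": 8.0}
similar_to_mathj = {"MathRev": 0.7, "ApplMath": 0.5}   # assumed h-similarities
print(round(relative_eminence(jif, "MathJ", similar_to_mathj), 2))  # > 1 despite low JIF
```

This illustrates the standardization effect: a mathematics journal with a modest absolute impact factor can still score above 1 relative to its own similarity neighbourhood.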

Restricted access

Abstract  

This paper reports on a bibliometric study of the characteristics and impact of research in the library and information science (LIS) field that was funded through research grant programs, and compares it with research that received no extra funding. Seven core LIS journals were examined to identify articles published in 1998 that acknowledge research grant funding. The distribution of these articles by various criteria (e.g., topic, affiliation, funding agency) was determined. Their impact, as indicated by citation counts during 1998–2008, was evaluated against that of articles published in the same journals in the same year that did not acknowledge extra funding, using citation data collected from Scopus’ Citation Tracker. The impact of grant-funded research as measured by citation counts was substantially higher than that of other research, both overall and in each journal individually. Scholars from outside LIS core institutions contributed heavily to grant-funded research. The two highest-impact publications by far reported non-grant-based research, and grant-based funding of research reported in core LIS journals was biased towards the information retrieval (IR) area, particularly towards research on IR systems. The percentage of articles reporting grant-funded research was substantially higher in information-oriented journals than in library-focused ones.

Restricted access

Abstract

The citation distribution of a researcher shows the impact of their production and determines the success of their scientific career. However, its application in scientific evaluation is difficult due to the bi-dimensional character of the distribution. Several bibliometric indexes that try to synthesize the principal characteristics of this distribution in a single numerical value have recently been proposed. In contrast with other bibliometric measures, the h-index reduces the biases provoked by the tails of the distribution. However, this index has limitations in discriminating among researchers with different publication habits: it penalizes selective researchers, distinguished by the large number of citations their papers receive, as compared to large producers. In this work, two original sets of indexes, the central area indexes and the central interval indexes, which complement the h-index by capturing the central shape of the citation distribution, are proposed and compared.
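The discrimination problem the abstract mentions is easy to reproduce. The sketch below uses invented citation lists and is not an implementation of the proposed central area or central interval indexes; it only shows a selective researcher and a large producer sharing the same h-index.

```python
# Illustration of the limitation described in the abstract (invented data):
# a "selective" researcher with few, highly cited papers and a "large
# producer" with many modestly cited papers can share the same h-index.
def h_index(citations):
    """Largest h such that at least h papers have >= h citations."""
    cites = sorted(citations, reverse=True)
    return sum(1 for i, c in enumerate(cites, start=1) if c >= i)

selective      = [250, 180, 120, 90, 20]         # few papers, many citations
large_producer = [9, 8, 7, 6, 5, 5, 4, 4, 3, 3]  # many papers, few citations each

print(h_index(selective), h_index(large_producer))   # both 5
```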

Restricted access

Introduction: Citation analysis, as a mature quantitative research method in bibliometrics and scientometrics, has been applied to many disciplines at home and abroad, especially in describing the evolution of disciplines, evaluating …

Restricted access

… Budapest, Leiden, Leuven, Beijing, Shanghai, etc.) or as independent commercial enterprises (e.g., Science-Metrix in Montreal). Two major companies (Thomson Reuters and Elsevier) are also active in this market. In other words, citation analysis has become …

Open access