Search Results

Showing items 181–190 of 463 for: "Impact factors"

Abstract  

Based on the citation data of journals covered by the China Scientific and Technical Papers and Citations Database (CSTPCD), we obtained aggregated journal-journal citation environments by applying routines developed specifically for this purpose. The local citation impact of a journal is defined as its share of the total citations in a local citation environment; this ratio can be visualized by the size of the nodes. The vertical size of a node varies proportionally with the journal's total citation share, while the horizontal size shows the citation share after correction for within-journal (self-) citations. In the "citing" environment, the counterpart of local citation performance can be read as a citation activity index. Using the "citing" patterns as variables, one can map how the relevant journal environments are perceived by the collective of authors of a journal, while the "cited" environment reflects the impact of journals in a local environment. In this study, we analyze the citation impacts of three Chinese journals in mathematics and compare local citation impacts with impact factors. Local citation impacts reflect a journal's status and function better than (global) impact factors. We also found that authors publishing in Chinese journals prefer international journals over domestic ones as sources for their citations.
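A minimal sketch of the kind of computation the abstract describes, with an invented citation matrix (the routines, data and journal names below are assumptions, not the authors' code): the "cited" share of each journal in a local journal-journal citation environment, with and without self-citations, alongside the "citing" activity share.

```python
# Illustrative sketch: local citation impact as a journal's share of all
# citations inside a local journal-journal citation environment.
import numpy as np

# Hypothetical citation matrix C, where C[i, j] = citations from journal i
# ("citing") to journal j ("cited") within the local environment.
journals = ["J1", "J2", "J3"]
C = np.array([
    [30, 10,  5],
    [ 8, 50, 12],
    [ 2,  6, 20],
])

total = C.sum()
cited_share = C.sum(axis=0) / total    # "cited" side: local citation impact
citing_share = C.sum(axis=1) / total   # "citing" side: citation activity

# Same "cited" shares after removing within-journal (self-) citations.
C_noself = C - np.diag(np.diag(C))
cited_share_noself = C_noself.sum(axis=0) / C_noself.sum()

for j, name in enumerate(journals):
    print(f"{name}: cited {cited_share[j]:.2f}, "
          f"cited w/o self-citations {cited_share_noself[j]:.2f}, "
          f"citing {citing_share[j]:.2f}")
```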

Restricted access

Abstract  

This study investigates the scientific output and publication patterns of Korean biotechnology before and after the start of the Korean Biotechnology Stimulation Plans (1994–2007), and compares the results with publication data from the same periods for Japan, the People’s Republic of China, Taiwan and Singapore. For this study, 14,704 publications with at least one author from one of the five Asian nations (indexed by SCI Expanded during 1990–1993 and 2000–2003) were considered. The marked increase in Korean research output in biotechnology was largely driven by an increasing tendency for researchers to enter the field of biotechnology and by increased expenditures on R&D activity under the Korean Biotechnology Stimulation Plans. In addition, the SCI Expanded coverage of national journals affected the scientific output and publication patterns of Japanese and Korean researchers. When Korean publications are broken down by collaboration type, international collaboration led to more publications in mainstream journals with high impact factors than local or domestic collaboration in both periods. However, although the Korean Biotechnology Stimulation Plans were followed by a remarkable increase in South Korea’s research output, this increase was not accompanied by growth in the quality of those publications in terms of the impact factors of the journals in which they appeared.

Restricted access

Abstract  

Using the MACTOR (Matrix of Alliances and Conflicts: Tactics, Objectives and Recommendations) method, a set of 13 related journals covering the subject category “Chemistry, Multidisciplinary” was analyzed in terms of direct and indirect reciprocal influences (measured by relatedness indexes Rji), their positions towards a generic set of common objectives (total cites, impact factor, immediacy index, number of published articles, cited half-life), and the convergences (Actors x Actors and Actors x Objectives) existing in this relatedness network. The study identified four types of actors: dominant (3), independent (8), relay (1) and dominated (1). Maps of influences and dependences between actors, of convergences between actors, of net distances between actors, and of actor-objective relationships are presented, together with short interpretations. Defining scientific journals as actors on a specific “knowledge market”, identifying influences and dependences between them, and positioning these journals towards a set of measurable objectives makes it possible to define “relationships of power” of a strategic nature and enables the introduction of more complex, future-oriented scientometric analyses than those based solely on standard bibliometric indicators such as the impact factor.
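The abstract does not spell out how the four actor types were derived; the sketch below follows the usual MACTOR convention of placing actors on an influence/dependence plane built from direct plus second-order influences. The matrix values, actor names and the mean-based quadrant limits are all assumptions for illustration only.

```python
# Hedged sketch of a MACTOR-style actor classification (not the study's exact
# procedure): direct influences extended with indirect (second-order) ones,
# then each actor placed on an influence/dependence plane.
import numpy as np

actors = ["A", "B", "C", "D"]
# Hypothetical matrix of direct influences MDI[i, j]: influence of actor i on j.
MDI = np.array([
    [0, 3, 2, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 2],
    [0, 0, 1, 0],
])

MIDI = MDI + MDI @ MDI        # direct plus indirect influences
np.fill_diagonal(MIDI, 0)     # ignore an actor's influence on itself

influence = MIDI.sum(axis=1)  # how strongly an actor acts on the others
dependence = MIDI.sum(axis=0) # how strongly the others act on it

for a, inf, dep in zip(actors, influence, dependence):
    if inf > influence.mean() and dep <= dependence.mean():
        kind = "dominant"
    elif inf > influence.mean():
        kind = "relay"
    elif dep > dependence.mean():
        kind = "dominated"
    else:
        kind = "independent"
    print(f"{a}: influence={inf}, dependence={dep} -> {kind}")
```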

Restricted access

Abstract  

Multivariate methods were employed in a comprehensive scientometric analysis of geostatistics research; the publication data came from the Science Citation Index and spanned the period from 1967 to 2005. Hierarchical cluster analysis (CA) was used to identify publication patterns based on different types of variables. A backward discriminant analysis (DA) with appropriate statistical tests was then conducted to confirm the CA results and evaluate the variations among the patterns. For the authorship pattern, the 50 most productive authors were classified by CA into 4 groups representing different levels, and DA produced 92.0% correct assignment with high reliability; the discriminant parameters were mean impact factor (MIF), annual citations per publication (ACPP), and the number of publications by the first author. For the country/region pattern, CA divided the top 50 most productive countries/regions into 4 groups with 95.9% correct assignment, and the discriminant parameters were MIF, ACPP, and independent publications (IP). For the institute pattern, 3 groups were identified from the top 50 most productive institutes with nearly 88.0% correct assignment, and the discriminant parameters were MIF, ACPP, IP, and international collaborative publications. Finally, for the journal pattern, the top 50 most productive journals were classified into 3 groups with nearly 98.0% correct assignment, and the discriminant parameters were total citations, impact factor and ACPP. We also analyzed general patterns for publication document type, language, subject category, and publication growth.
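A compact illustration of the two-step workflow the abstract outlines (cluster analysis followed by discriminant analysis); the indicator values, group count and library choices below are assumptions, not the paper's implementation.

```python
# Illustrative sketch: hierarchical cluster analysis on bibliometric indicators,
# then a linear discriminant analysis to check how well the clusters can be
# reproduced (the resubstitution accuracy plays the role of "correct assignment").
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# Hypothetical indicator matrix: rows = authors, columns = MIF, ACPP,
# first-author publications (values are made up for the example).
X = rng.normal(loc=[1.5, 4.0, 10.0], scale=[0.5, 1.5, 4.0], size=(50, 3))

# Step 1: hierarchical clustering (Ward linkage) into 4 groups.
groups = fcluster(linkage(X, method="ward"), t=4, criterion="maxclust")

# Step 2: discriminant analysis on the same variables.
lda = LinearDiscriminantAnalysis().fit(X, groups)
print(f"correct assignment: {lda.score(X, groups):.1%}")
```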

Restricted access

Abstract  

The long-term influence and contribution of research can be evaluated relatively reliably by bibliometric citation analysis. Previously, the productivity of nations has been estimated using either the number of published articles or journal impact factors and/or citation data. These studies show certain trends, but detailed analysis is not possible because they assume that all articles in a journal are equally cited. Here we describe the first comprehensive, long-term, nationwide analysis of scientific performance. We studied the lifetime research output of 748 Finnish principal investigators in biomedicine during the years 1966–2000, analysed national trends, and made a comparison with international research production. Our results indicate that analyses of the scientific contribution of persons, disciplines, or nations should be based on actual publication and citation counts rather than on derived information such as impact factors. Although 51% of the principal investigators published 75% of the articles, the whole scientific community has contributed to the growth of biomedical research in Finland since the Second World War.

Restricted access

Abstract  

For each of the years 2003, 2004, and 2005, the number of citations for individual papers published in Physics in Medicine and Biology was compared with the mean quality score assigned to the manuscript by two independent experts as part of the normal peer review process. A low but statistically significant correlation was found between citations and quality score (1 best to 5 worst) for every year: 2003: −0.227 (p < 0.001); 2004: −0.238 (p < 0.001); 2005: −0.154 (p < 0.01). Papers in the highest quality category (approximately 10 per cent of those published) were cited about twice as often as the average for all papers. The data were also examined retrospectively by dividing the papers published in each year into five citation quintiles. A paper of the highest quality is about ten times more likely to be found in the most cited quintile than in the least cited quintile. On the assumption that the mean number of citations per paper is a reasonable surrogate for the impact factor, it was also shown that the impact factor of Physics in Medicine and Biology could be increased substantially by rejecting more papers on the basis of the reviewers’ scores. To accomplish this, however, would require reducing the acceptance rate of manuscripts from about 50 per cent to near 10 per cent.
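The calculation behind this kind of result can be reproduced on synthetic data; the sketch below (not the study's data or code) computes the rank correlation between reviewer scores and citations and estimates how the mean citations per paper, used here as an impact-factor surrogate, would change if only the best-scored fraction of papers were accepted.

```python
# Hedged sketch with synthetic data: rank correlation between reviewer scores
# (1 best ... 5 worst) and citations, and the effect on mean citations per
# paper of rejecting the worst-scored papers.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n = 1000
score = rng.integers(1, 6, size=n)                     # 1 = best, 5 = worst
citations = rng.poisson(lam=np.maximum(8 - score, 1))  # worse scores, fewer citations

rho, p = spearmanr(score, citations)
print(f"Spearman rho = {rho:.3f} (p = {p:.3g})")       # negative, as in the abstract

baseline = citations.mean()
order = np.argsort(score)                              # best-scored papers first
for keep_fraction in (0.5, 0.1):
    kept = citations[order[: int(n * keep_fraction)]]
    print(f"accept {keep_fraction:.0%}: mean citations per paper "
          f"{kept.mean():.2f} vs {baseline:.2f} overall")
```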

Restricted access
Scientometrics
Authors: Donatella Ugolini, Marco Cimmino, Cristina Casilli, and Giuseppe Mela

Abstract  

This study evaluates the distribution of papers published by European Union (EU) authors in ophthalmological journals from 1995 to 1997. The impact of ophthalmological research in the EU is compared with that produced in other countries, and research trends are highlighted through keyword analysis. Data on articles published in ophthalmological journals (ISI subject category) were downloaded, and the mean Impact Factor, source-country population and gross domestic product were analyzed. Special-purpose software was used for the keyword analysis. Worldwide, 11,219 papers were published in ophthalmological journals: 34.8% came from the EU (with the UK, Germany, France, Italy and the Netherlands at the top) and 40.7% from the US. The mean Impact Factor of EU papers was 0.8, compared with 1.5 for the US. Despite the limitations of the existing methods, bibliometric findings are useful for monitoring research trends. The keyword analysis shows that the leading fields of research were retinal pathologies among diseases and keratoplasty among surgical procedures. It also suggests that keywords are overused, and urges that this overuse be minimized and that usage be standardized among journal editors.

Restricted access

The SCI Journal Citation Reports: A potential tool for studying journals?

I. Description of the JCR journal population based on the number of citations received, number of source items, impact factor, immediacy index and cited half-life

Scientometrics
Authors: Marie-Hélène Magri and Aline Solari

Abstract  

In this paper, we analysed six indicators from the SCI Journal Citation Reports (JCR) over a 19-year period: number of total citations, number of citations to the two previous years, number of source items, impact factor, immediacy index and cited half-life. The JCR seems to have become more or less an authority for evaluating scientific and technical journals, essentially through its impact factor. However, it is difficult to find one's way around the impressive mass of quantitative data that the JCR provides each year. We proposed the box plot method to aggregate the values of each indicator so as to obtain, at a glance, portrayals of the JCR population from 1974 to 1993. These images reflected the distribution of the journals into four groups designated low, central, high and extreme. The limits of the groups became a reference system with which, for example, a given journal could rapidly be situated visually within the overall JCR population. Moreover, the box plot method, which gives a zoom effect, made it possible to visualize a large sub-population of the JCR that is usually overshadowed by the journals at the top of the rankings. These top-level journals implicitly play the role of reference in evaluation processes, which often incites categorical judgements when the journals to be evaluated are not part of the top level. Our "rereading" of the JCR, which presents the JCR product differently, made it possible to qualify these judgements and to shed new light on journals.
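A small sketch of a box-plot-based grouping on one JCR indicator. The abstract does not give the exact group limits, so the quartiles and the standard Tukey upper fence used below are assumptions; the impact-factor values are invented.

```python
# Illustrative sketch: classify journals on one JCR indicator into the
# low / central / high / extreme groups using box-plot statistics.
import numpy as np

impact_factors = np.array([0.1, 0.3, 0.5, 0.8, 1.0, 1.2, 1.5, 2.0, 3.5, 12.0])

q1, q3 = np.percentile(impact_factors, [25, 75])
upper_fence = q3 + 1.5 * (q3 - q1)   # Tukey's upper whisker limit (assumed cut-off)

def group(value: float) -> str:
    if value > upper_fence:
        return "extreme"
    if value > q3:
        return "high"
    if value >= q1:
        return "central"
    return "low"

for v in impact_factors:
    print(f"IF {v:>5.1f} -> {group(v)}")
```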

Restricted access

Abstract  

We comment on a 1996 letter to Nature on the long-term decline of Indian science, pointing out methodological reasons why the SCI data used by the authors do not unambiguously lead to their stated conclusions. Our arguments are based on the contention that no valid statement about change in a country's output can be made for a period in which the SCI's journal coverage for that country has changed significantly. We suggest that, for longitudinal comparisons of country-level performance, it should be verified that the journals from that country covered in the SCI remained constant within the period; this could be ensured if the country of publication of journals were included as a field in the SCI database. We define a Visibility Index as the cumulated impact and derive a relation to estimate the change in visibility by combining changes in output and average impact. For the period during which Indian journal coverage remained unchanged, a detailed analysis of output for two years (1990–94) leads us to conclude that, with the exception of Agriculture, there has been an increase in publication in virtually every field, with a significant increase in the overall mean Impact Factor. At least 25 subfields have been identified with statistically significant increases in mean Impact Factor and Visibility. The impact of foreign collaboration on visibility is also considered. In conclusion, we touch upon the question of citations as a performance indicator for Third World countries, as high citation counts and local relevance may be conflicting objectives.
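A hedged reconstruction of the relation the abstract alludes to: if the Visibility Index is the cumulated impact, i.e. output times mean impact factor (an assumption, since the abstract does not give the formula), then the change in visibility factors into the change in output and the change in average impact.

```latex
% Assumed definition: V = number of papers N times mean impact factor \bar{I}.
\[
  V = N\,\bar{I}, \qquad
  \frac{V_2}{V_1} = \frac{N_2}{N_1}\cdot\frac{\bar{I}_2}{\bar{I}_1}, \qquad
  \frac{\Delta V}{V} \approx \frac{\Delta N}{N} + \frac{\Delta \bar{I}}{\bar{I}}
  \quad\text{(for small relative changes)}.
\]
```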

Restricted access

Abstract  

This paper examines the contribution of Indian universities to the mainstream scientific literature during 1987–1989 along two distinct but inter-related dimensions: the quantity and the quality of research output. The quantity of output is assessed through the number of articles published in journals covered by the Science Citation Index, while the quality of output is assessed through the impact factors of the journals in which the articles are published. The impact factors are normalized to eliminate the confounding effects of their covariates, viz. the subject field and the nature of the journal. A number of relative indicators are constructed for inter-field and inter-institution comparisons, viz. the publication effectiveness index, relative quality index, activity index and citability index. Inter-field comparisons are made at the level of eight macrofields: Mathematics, Physics, Chemistry, Biology, Earth & Space Sciences, Agriculture, Medical Sciences and Engineering & Technology. Inter-institution comparisons cover thirty-three institutions which had published at least 150 articles in three years. The structure of correlations of these institutions with the eight macrofields is analyzed through correspondence analysis of the matrices of activity and citability profiles. Correspondence analysis yields a mapping of institutions which reveals the structure of science as determined by the cumulative effect of resource allocation decisions taken in the past for different fields and institutions, i.e. the effect of national science policy.
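One of the relative indicators mentioned, the activity index, is commonly defined as an institution's share of its own output in a field divided by the corresponding share for all institutions together; the sketch below uses that standard definition and invented counts, so the paper's exact normalization may differ.

```python
# Hedged sketch of a standard activity index computed from an
# institution x field publication matrix (values are hypothetical).
import numpy as np

fields = ["Physics", "Chemistry", "Biology"]
P = np.array([
    [120, 40, 10],   # institution A
    [ 30, 90, 60],   # institution B
])

field_share_inst = P / P.sum(axis=1, keepdims=True)   # each institution's field profile
field_share_all = P.sum(axis=0) / P.sum()             # field profile of all institutions
activity_index = field_share_inst / field_share_all   # AI > 1 means relative focus on a field

for i, row in enumerate(activity_index):
    print(f"institution {i}: " +
          ", ".join(f"{f}={ai:.2f}" for f, ai in zip(fields, row)))
```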

Restricted access