One of the basic dependent variables in the sociology of science is the rate at which scientific knowledge advances. Sociologists of science have in the past assumed that the rate of scientific advance was a function of the number of talented people entering science. This assumption was challenged by Derek Price, who argued that as the number of scientists increased, the number of high-quality scientists would increase at a slower rate. This paper reports the results of an empirical study of changes in the size of academic physics in the U.S. between 1963 and 1975. In each year we count the number of new Assistant Professors appointed in Ph.D.-granting departments. During the early 1960s there was a sharp increase in the size of entering cohorts, followed by a sharp decline. A citation analysis indicates that the proportion of each cohort publishing work that was cited at least once in the first three years after appointment was relatively constant. This leads to the conclusion that the number of scientists capable of contributing to the advance of scientific knowledge through their published research is a linear function of the total number of people entering science.
I think that most of the problems mentioned in the GS paper are caused by natural evolutionary aspects of the discipline. It cannot be doubted that BIS is growing into a more and more professional research discipline. There are indeed problems of quality and of the fact that researchers have different origins. The first problem is evolving in the right direction, and the second should be considered an enrichment rather than a negative fact. One must admit, nevertheless, that different subdisciplines will tend to lead their own lives, but continuing contacts (such as joint conferences) remain important and are necessary for the further development of all these subdisciplines.
The growth rate of scientific publication has been studied from 1907 to 2007 using available data from a number of literature databases, including the Science Citation Index (SCI) and the Social Sciences Citation Index (SSCI). Traditional scientific publishing, that is, publication in peer-reviewed journals, is still increasing, although there are large differences between fields. There are no indications that the growth rate has decreased in the last 50 years. At the same time, publication through new channels, for example conference proceedings, open archives, and home pages, is growing fast. The growth rate for SCI up to 2007 is smaller than for comparable databases, which means that SCI has been covering a decreasing part of the traditional scientific literature. There are also clear indications that SCI coverage is especially low in some of the scientific areas with the highest growth rates, including computer science and the engineering sciences. The role of conference proceedings, open access archives, and publications posted on the web is increasing, especially in scientific fields with high growth rates, but this is only partially reflected in the databases. The new publication channels challenge the use of the big databases in measuring scientific productivity or output and the growth rate of science. Because of this declining coverage, it is problematic that SCI has been, and continues to be, used as the dominant source for science indicators based on publication and citation numbers. The limited data available for the social sciences show that the growth rate in SSCI was remarkably low and indicate that SSCI coverage has been declining over time. National Science Indicators from Thomson Reuters is based solely on SCI, SSCI, and the Arts and Humanities Citation Index (AHCI); the declining coverage of these citation databases therefore calls the use of this source into question.