The science-related material published in newspapers can be analysed to provide insight into the biases and techniques involved in transferring knowledge from the science community to the general public. Part of such analysis can be carried out in quantitative terms. Three such quantitative approaches are illustrated here: (1) measurement of the space devoted to science; (2) derivation of readability indices; (3) content analysis.
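To illustrate the second approach, here is a minimal sketch of one widely used readability index, the Flesch Reading Ease score. This is offered only as an example of the general technique; the abstract does not specify which index the study used, and the syllable counter below is a crude vowel-group heuristic, not a dictionary-accurate one.

```python
import re

def count_syllables(word):
    # Crude heuristic: count groups of consecutive vowels,
    # treating a trailing silent 'e' as non-syllabic.
    groups = re.findall(r"[aeiouy]+", word.lower())
    n = len(groups)
    if word.lower().endswith("e") and n > 1:
        n -= 1
    return max(n, 1)

def flesch_reading_ease(text):
    # Flesch Reading Ease: higher scores indicate easier text.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))
```

Applied to a corpus of newspaper science coverage, such a score gives one quantitative handle on how accessible the writing is to a general readership, though any conclusions depend heavily on the quality of the syllable heuristic.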
A recent extensive review of research in British universities has produced a research rating for each university department based primarily on peer review of the department's publications. In this preliminary study, we compare these ratings with publication and citation data for the chemistry departments at two British universities. The results underline the importance of the most productive researchers in departments. This point is supported by citation data from a chemical engineering department.
Authors: Sally Hodges, B. Hodges, A. Meadows, Micheline Beaulieu, and D. Law
Recent years have seen a growing interest in the use of quantitative parameters for assessing the quality of research carried out at universities. In the UK, university departments are now subject to regular investigations of their research standing. As part of these investigations, a considerable amount of quantitative (as well as qualitative) information is collected from each department and made available to the panels appointed to assess research quality in each subject area. One question that has been raised is whether these data can be combined in some way to provide an index to help guide the panels' deliberations. This question is examined here through a detailed analysis of the returns from four universities for the most recent (1992) research assessment exercise. The results suggest that attempts to derive such an algorithm are only likely to be helpful for a limited range of subjects.