Knowledge flow between the public and private sectors is widely recognized as a stimulus for innovation and regional development, particularly in science parks. This work employs a bibliometric approach based on patent citations, non-patent citations, and public–private co-authorship of scientific publications to measure the use of public research in Hsinchu Science Park (HSP) in Taiwan. The results show that the number of jointly published papers has increased steadily, implying that collaboration between HSP firms and universities has become more common. In terms of co-patenting, patent citations, and non-patent references, however, technological innovation stemming from public research still needs to be strengthened.
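The co-authorship measurement described above can be sketched as a simple affiliation-matching count. This is only an illustrative sketch: the keyword lists and the `copub_counts_by_year` helper are assumptions for illustration; real bibliometric studies rely on curated affiliation registries rather than keyword matching.

```python
# Hypothetical keyword lists for sector classification (assumption for
# illustration; curated institution registries are used in practice).
PUBLIC_KEYWORDS = ("university", "institute", "college")
PRIVATE_KEYWORDS = ("corp", "inc", "ltd", "technology co")

def is_public_private_copub(affiliations):
    """True if a paper's author affiliations span both sectors."""
    text = [a.lower() for a in affiliations]
    has_public = any(k in a for a in text for k in PUBLIC_KEYWORDS)
    has_private = any(k in a for a in text for k in PRIVATE_KEYWORDS)
    return has_public and has_private

def copub_counts_by_year(papers):
    """papers: iterable of (year, affiliation_list); returns {year: count}
    of public-private co-publications, the time series whose growth the
    study reports."""
    counts = {}
    for year, affs in papers:
        if is_public_private_copub(affs):
            counts[year] = counts.get(year, 0) + 1
    return counts
```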
This paper presents a methodology for measuring improvements in efficiency and adjustments in the scale of R&D (research and development) activities. For this purpose, the study decomposes academic productivity growth into components attributable to (1) world academic frontier change, (2) R&D efficiency change, (3) human capital accumulation, and (4) capital accumulation. The world academic frontier at each point in time is constructed using data envelopment analysis (DEA). The study calculates each of these four components of academic productivity for 27 countries over 1990–2003, and finds that the components contributing to academic productivity growth vary with each country's characteristics and development stage. Human capital accumulation carries more weight for the quantity of academic research, whereas capital accumulation plays a more important role in the citation impact of academic research.
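A DEA frontier of the kind the abstract describes can be sketched as an output-oriented CCR model solved as a linear program, one per country (DMU). This is a minimal sketch, not the paper's exact specification: the function name and the choice of the output-oriented CCR form are assumptions; the study's actual decomposition involves frontiers at multiple points in time.

```python
import numpy as np
from scipy.optimize import linprog

def dea_output_efficiency(X, Y):
    """Output-oriented CCR efficiency for each DMU (e.g., country).
    X: (n, m) array of inputs (e.g., researchers, R&D capital).
    Y: (n, s) array of outputs (e.g., papers, citations).
    Returns phi for each DMU: phi == 1 means the DMU lies on the
    frontier; phi > 1 is the feasible proportional output expansion."""
    n, m = X.shape
    s = Y.shape[1]
    phis = []
    for o in range(n):
        # Decision variables: [phi, lambda_1, ..., lambda_n]; maximize phi.
        c = np.concatenate(([-1.0], np.zeros(n)))
        # Input constraints: sum_j lambda_j * x_ij <= x_io.
        A_in = np.hstack([np.zeros((m, 1)), X.T])
        b_in = X[o]
        # Output constraints: phi * y_ro - sum_j lambda_j * y_rj <= 0.
        A_out = np.hstack([Y[o][:, None], -Y.T])
        b_out = np.zeros(s)
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.concatenate([b_in, b_out]),
                      bounds=[(0, None)] * (n + 1))
        phis.append(res.x[0])
    return np.array(phis)
```

With one input and one output, the frontier reduces to the best output/input ratio, which makes the scores easy to check by hand.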
In scientometric trend analysis, the parameter choices made for observing trends have often been ad hoc in past studies: for example, different year spans were used to create the time sequence, and different indices were chosen for trend observation. However, the effectiveness of these choices has rarely been examined quantitatively and comparatively. This work provides clues for better interpreting the results obtained under a given choice. Specifically, by sorting research topics in decreasing order of interest predicted by a trend index and then evaluating this ordering with information retrieval measures, we compare a number of trend indices (percentage of increase vs. regression slope), trend formulations (simple trend vs. eigen-trend), and options (various year spans and prediction durations) in different domains (safety agriculture and information retrieval) with different collection scales (72,500 papers vs. 853 papers) to determine which leads to better trend observation. Our results show that the slope of a linear regression on the time series consistently outperforms the other indices. More interestingly, this index is robust under different conditions and is hardly affected even when the collection is split into arbitrary (e.g., only two) periods. Implications of these results are discussed. Our work not only provides a method for evaluating trend-prediction performance in scientometrics, but also offers insights and reflections for past and future trend observation.
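The evaluation procedure described above, ranking topics by a regression-slope trend index and scoring the ranking with an information-retrieval measure, can be sketched as follows. The topic series, the use of average precision, and the function names are assumptions for illustration; the paper compares several indices and measures rather than this single combination.

```python
import numpy as np

def slope_index(counts):
    """Trend index: slope of a linear regression fitted to a topic's
    yearly frequency series (the index the abstract finds most robust)."""
    years = np.arange(len(counts))
    slope, _intercept = np.polyfit(years, counts, 1)
    return slope

def average_precision(ranked, relevant):
    """IR-style average precision of a ranked topic list against the
    set of topics labeled as truly trending."""
    hits, score = 0, 0.0
    for i, topic in enumerate(ranked, 1):
        if topic in relevant:
            hits += 1
            score += hits / i
    return score / max(len(relevant), 1)

# Toy yearly counts (assumed data): topic A rising, B flat, C falling.
series = {"A": [1, 3, 5, 7], "B": [4, 4, 4, 4], "C": [7, 5, 3, 1]}
ranked = sorted(series, key=lambda t: slope_index(series[t]), reverse=True)
ap = average_precision(ranked, relevant={"A"})
```

Sorting by slope places the rising topic first, so a gold label of `{"A"}` yields an average precision of 1.0; weaker indices would be penalized by the same measure whenever they misrank the rising topics.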