Open access

Abstract

The purpose of our contribution is to discuss shortcomings of purely descriptive quantitative evaluation of research policies – based either on inputs (public investment, number of researchers), or outputs (publications, EU grants, number of patents). To give an example, we compare selected indicators across Visegrad countries in the period between 2006 and 2015. We conclude that both quantitative and qualitative perspectives as well as societal and political contexts should be taken into account when the performance of any R&D system and the impact of public investments into a public R&D sector are scrutinized.

Introduction

It has been assumed that activities connected with science, technology and innovation (STI) have a direct impact on social, economic and sustainable development. Investing in research is seen as crucial not only for improving technologies but also for social and economic wellbeing (Macilwain, 2010). In many countries – developed as well as developing – this belief has resulted in an increasing number of STI bodies, new legal STI frameworks, various STI policy instruments and, finally, in larger investments in scientific research, technological development and innovation, in both the public and private sectors. Consequently, the world has witnessed exponential growth in scientific outputs – articles and patents (UNESCO, 2010). At the same time, increasing attention has been paid to evaluation and comparative studies.

One way to analyze and compare various R&D systems and evaluate their performance is to use quantitative input and output indicators. These indicators are relatively easy to collect, and they have been used by organizations such as the OECD, the World Bank and the European Commission. Various benchmarking exercises, including for example the European Innovation Scoreboard,1 have also been using input and output indicators to assess the innovation performance of individual countries.

Inputs are typically measured by overall R&D spending, public and private expenditure on R&D, the share of R&D expenditure in GDP, the number of R&D personnel and similar indicators. Outputs can be represented, for example, by the number of publications, the number of patents, the income of research organizations from industry or the number of prestigious EU grants. All these indicators describe the major characteristics of an R&D system, its performance and its development over time.

One of the most important input indicators is the volume of resources spent on R&D. In general, industrialized and developed countries believe that R&D investment contributes to economic growth. However, the connection between R&D spending and direct commercial impact is not necessarily straightforward (Gough, 2016). Gough compares the research outcomes and innovation performance of Australia and Ireland. Whereas Australia has a very good record of producing high-quality research publications (12th place in the Nature Index 2015 Global ranking), it lags behind in the 2016 Global Innovation Index (73rd in the innovation efficiency ratio). Ireland ranks 8th in innovation efficiency, while it holds 28th position in the Nature Index 2015 Global ranking, which reflects highly cited research papers.

Meo et al. (2013) analyzed the productivity and visibility of research papers in selected Asian countries. They discovered that spending on R&D, the number of universities and the number of scientific indexed journals are positively associated with the total number of research documents, citations per document and the H-index in various science and social science fields. However, the authors found no connection between per capita GDP and research outcomes.

In general, we agree with the view that resources should be invested in science and R&D activities – both public and private. It is highly probable that investment in R&D will eventually lead to new discoveries, new technologies and more innovative products. The volume of resources invested in the system (mainly the public ones) is very important. Yet it should not be the only indicator – even when complemented by other quantitative ones – used to describe an R&D system, its quality and its performance.

For example, the European Union (EU) has not been focusing purely on quantitative indicators when designing and implementing its R&D policy. When the EU launched its ambitious plan to invest 3 percent of the EU's GDP in R&D one decade ago, the European Commission (EC) stressed at the same time that attention should be paid to impacts rather than to inputs. The Commission added that the composition of R&D spending should be carefully considered, the conditions for R&D undertaken in the private sector should be improved, and R&D and innovation should be better linked together (European Commission, 2010).

However, measuring the impact of public investment in R&D has also been challenged. For example, the Australian Research Quality Framework (RQF) recognized the limitations of technometrics and sociometrics, which were underdeveloped for assessing impact or for evaluating the allocation of research funds (Donovan, 2007). In general, the use of indicators in research policy and evaluation has been perceived as problematic (Ràfols, 2019).

The main question we would like to tackle in our contribution is whether quantitative indicators alone give us an appropriate picture of an individual R&D system. What else should be taken into consideration in this respect? Why are science policy2 and its components important? In tackling these questions, we look at four Visegrad countries (V4) – the Czech Republic, Hungary, Poland and Slovakia – as an example.

We are aware of the fact that in the majority of developed countries, private investment constitutes the more significant part of R&D investment in general (EC, 2018; OECD, 2018). However, for the purpose of this paper, we focus specifically on public investment in selected sectors. As Soete, Verspagen, and Ziesemer (2020) argue, the evaluation of public policies targeted at the public sector (universities and public or semi-public research organizations) has so far received limited attention in the econometric literature.

In order to obtain comparable data across all four countries, we included all public institutions and excluded the private sector. By public institutions, we mean higher education institutions, academies of sciences, governmental research organizations, teaching hospitals, as well as private not-for-profit research organizations; the latter constitute a rather insignificant part of the R&D systems in the V4 countries. We compare data from the period between 2006 and 2015. On the input side, we analyze mainly public expenditure in the selected sectors. On the output side, we focus on the number of research publications, the number of patents filed by the public sector and the number of ERC grant applications.

The paper is structured in the following way. In the next two sections, we compare selected inputs and outputs of the Czech, Hungarian, Polish and Slovak R&D systems. In the concluding part, we elaborate our standpoint that the performance of public R&D systems is much more complex and goes beyond quantitative indicators, and we suggest some directions for further research. Our paper is also meant to initiate further discussion about science policy and related public policies, not only in the Visegrad countries (V4) but also in other Central and Eastern European countries which are at a similar stage of development.

Comparing selected inputs

In Table 1, we can see the R&D expenditure from public resources in the four countries. In the period between 2006 and 2015, public investment in R&D was more or less growing in three of the V4 countries; Hungary was the exception.

Table 1.

Public R&D expenditure in selected sectors of performance (in mil. EURO)

       2006   2007   2008   2009   2010   2011   2012   2013   2014   2015
CZ      551    658    735    752    745    843    847    853    845    913
HU      366    387    395    354    349    350    334    321    310    308
POL     810    971  1,244  1,194  1,493  1,474  1,618  1,474  1,540  1,604
SVK     101    126    143    144    188    215    227    223    267    279

Source: Eurostat, available from: https://ec.europa.eu/eurostat/data/database.
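The divergence across the four countries can be made explicit by computing the relative change between 2006 and 2015 directly from Table 1. A minimal sketch in Python (country codes and figures taken from the table above):

```python
# Change in public R&D expenditure between 2006 and 2015,
# computed from Table 1 (figures in mil. EUR).
expenditure = {
    "CZ": (551, 913),
    "HU": (366, 308),
    "POL": (810, 1604),
    "SVK": (101, 279),
}

# Percentage change 2006 -> 2015, rounded to whole percent.
growth = {
    country: round((y2015 - y2006) / y2006 * 100)
    for country, (y2006, y2015) in expenditure.items()
}
print(growth)
```

Even this simple calculation makes the divergence visible: Slovakia and Poland roughly doubled or more their nominal public R&D expenditure, the Czech Republic grew by about two thirds, while Hungary's expenditure declined by roughly 16 percent.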

In Fig. 1, we can see the volume of public expenditure in the selected sectors relative to the GDP of the individual countries. While in the Czech Republic and Slovakia we can see a gradual increase, Poland witnessed stagnation, and in Hungary the trend was even decreasing.

Fig. 1.

Public R&D expenditure in selected sectors of performance, Source: Eurostat, Available at https://ec.europa.eu/eurostat/data/database

Citation: Hungarian Educational Research Journal 10, 4; 10.1556/063.2020.00035

When comparing R&D systems in individual countries, it is also important to take into account how large the public R&D system is. To do that, we use the full-time equivalent (FTE) number of researchers in the selected sectors as a benchmarking reference. The following graph shows that the numbers of researchers stagnated in Hungary and Slovakia between 2006 and 2015 and fluctuated slightly in Poland, while in the Czech Republic they increased continually, with the single exception of 2009 (Fig. 2).

Fig. 2.

Researchers in Public Sector, Source: Eurostat, Available at: https://ec.europa.eu/eurostat/data/database

When we divide public expenditure by the number of researchers in the selected sectors to obtain public spending per researcher, we get a somewhat different picture than in the two previous graphs. With the exception of Slovakia, expenditure per researcher was more or less stagnating between 2011 and 2015. In Hungary, the figure for 2015 was even lower than that for 2006 (Fig. 3).

Fig. 3.

Public R&D expenditure per one researcher (FTE), Source: Authors

The above-mentioned data can be interpreted in the following way. Whereas Slovakia did well in terms of public investment in the selected R&D sectors, Hungary lagged far behind. The Czech Republic and Poland did rather well; however, after a few years of a positive trend, they stagnated.

It is important to note the rather significant differences in absolute numbers per researcher between the countries. In 2015, public investment in the selected sectors per researcher was 24,000 EUR in Slovakia, 30,000 EUR in Poland and Hungary, and 49,000 EUR in the Czech Republic. In comparison, the EU28 average for the same year was 89,000 EUR. For the sake of simplification, we can say that Hungary and Poland invested a similar amount per researcher from public resources during the analyzed period. The Czech Republic invested twice as much as Slovakia, yet it reached only 55 percent of the EU28 average. For a more detailed analysis, we would need to know the breakdown of this expenditure, such as investment, operating costs and personnel costs.
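The per-researcher figures quoted above are simple ratios of expenditure to FTE headcount. A minimal sketch, where the 2015 expenditure values come from Table 1 and the FTE researcher counts are illustrative assumptions, back-calculated from the per-researcher figures quoted in the text rather than taken from Eurostat:

```python
# Public R&D expenditure per FTE researcher, 2015.
# Expenditure (mil. EUR) comes from Table 1; the FTE researcher counts
# are illustrative assumptions back-calculated from the per-researcher
# figures quoted in the text, not official Eurostat numbers.
expenditure_meur = {"CZ": 913, "HU": 308, "POL": 1604, "SVK": 279}
researchers_fte = {"CZ": 18600, "HU": 10300, "POL": 53500, "SVK": 11600}

per_researcher_eur = {
    c: expenditure_meur[c] * 1_000_000 / researchers_fte[c]
    for c in expenditure_meur
}
for c, eur in sorted(per_researcher_eur.items()):
    print(f"{c}: {eur:,.0f} EUR per FTE researcher")
```

Under these assumptions the computation reproduces the rounded figures in the text: roughly 49,000 EUR for the Czech Republic, 30,000 EUR for Hungary and Poland, and 24,000 EUR for Slovakia.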

Comparing selected outputs

Public investment in R&D is in fact one of the most important indicators describing an individual R&D system and a country's priorities. Yet it is usually complemented by output indicators. These can capture personnel (the number of grant holders, the number of Nobel prize winners, etc.), the number of scientific publications, outputs related to the protection of intellectual property (e.g. the number of patents), or outputs indicating the level of knowledge transfer and commercialization of R&D results, such as the number of licensing agreements, income from licensing, the number of spin-off companies created by research organizations or the volume of contract research. All the above-mentioned indicators are relevant, as their level can indicate what a society receives back as an output of public investment in R&D. However, some of the numbers are difficult to obtain or are not easily comparable. In the next section, we use only the following selected output indicators.

On the output side, we first look at the number of scientific publications; for this purpose, we use the Science Citation Index (SCI) and the Social Sciences Citation Index (SSCI). We simply presume that authors of scientific publications are almost exclusively affiliated with the public research sector as delineated above.

First, we have to mention that, altogether, the number of publications in the EU28 countries increased from approximately 480,000 to 635,000 (World Bank, 2018). As we can see in Table 2, within one decade the number of scientific publications almost doubled in the Czech Republic and Slovakia. Poland witnessed a more moderate increase (around one half), whereas in Hungary the number was rather stagnant over time.

Table 2.

Number of scientific publications included in SCI and SSCI

        2006    2007    2008    2009    2010    2011    2012    2013    2014    2015
CZ     8,838   9,988  10,649  11,197  12,673  13,544  13,948  14,401  15,675  16,873
HU     5,530   5,884   6,408   6,029   5,870   6,419   6,570   6,468   6,858   6,566
POL   21,267  21,568  23,376  23,470  24,551  25,857  28,115  30,064  31,779  32,776
SVK    2,644   2,800   3,332   3,164   3,638   3,899   4,261   4,616   5,139   5,207

Source: The World Bank, Available from: https://data.worldbank.org/indicator/IP.JRN.ARTC.SC.

However, when we compare the number of scientific publications per researcher in each year, the overall picture changes slightly. The most productive is the Czech Republic, with 0.89 publications per researcher, which is even above the EU28 average (0.69). Hungary and Poland approached the EU28 average in 2015, while Slovakia still lagged behind with 0.44 publications per researcher (Fig. 4).
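The productivity ratio is the same kind of per-researcher division, this time applied to the publication counts from Table 2. In the sketch below, the FTE researcher counts are illustrative assumptions back-calculated from the ratios quoted above, not Eurostat data:

```python
# Publications per FTE researcher, 2015.
# Publication counts come from Table 2; the researcher counts are
# illustrative assumptions back-calculated from the ratios in the text.
publications_2015 = {"CZ": 16873, "SVK": 5207}
researchers_fte = {"CZ": 19000, "SVK": 11800}  # assumed, FTE

ratios = {
    country: round(pubs / researchers_fte[country], 2)
    for country, pubs in publications_2015.items()
}
print(ratios)
```

With these assumed headcounts, the ratios come out at the two extremes quoted in the text: about 0.89 for the Czech Republic and 0.44 for Slovakia.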

We are aware of the fact that SCI and SSCI do not cover the productivity of the whole R&D system. However, they can give us some idea of how productive researchers in the V4 countries are. At the same time, the productivity of a system, of institutions, or even of individuals cannot be evaluated only by the number of publications in various databases. For detailed analyses, we would have to use other, rather qualitative parameters, such as the number of publications in the top 10 percent of journals, citation indices, influence scores, retracted articles, publications in predatory journals, etc.

Fig. 4.

Number of scientific publications per one researcher, Source: Authors

Another way to look at the performance of an R&D system supported by public resources is the number of applications for prestigious ERC grants by individual countries. For this reason, we look at the grants awarded by the European Research Council.3 The ERC scheme is highly competitive: in 2018, for example, the success rate for ERC Starting Grants was around 13 percent. Young researchers submitted 3,170 applications, of which only 403 were granted support (ERC, 2018).
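The quoted success rate follows directly from the two figures above:

```python
# ERC Starting Grant success rate in 2018, from the figures above.
submitted = 3170
funded = 403
success_rate_pct = funded / submitted * 100
print(f"{success_rate_pct:.1f}%")  # roughly 13 percent
```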

Fig. 5 shows the number of all applications (Starting, Consolidator, Advanced, Proof of Concept and Synergy) submitted by the V4 countries to the ERC scheme. We can see that Polish applicants submitted the highest number of proposals, whereas the numbers of proposals submitted by Czechs and Hungarians were comparable. Slovak applications were consistently low over the period, with a maximum of 36 applications in 2007. When the number of researchers in each system is taken into account, Czech and Hungarian researchers are the most active.

Fig. 5.

ERC proposals evaluated, Source: Technology Center CAS; Available at: http://svizualizace.tc.cas.cz/maps/main.html

We can now look at how successful applicants from the V4 countries were. This should be taken only as a complementary measure, as the number of grants received was rather limited in the period discussed. Despite the decrease in the number of researchers and the continuous stagnation of public resources flowing into its R&D system, Hungary, with 59 ERC grants, has been the most successful of all the new EU member countries. Poland and the Czech Republic each attracted 25 ERC grants, while Slovakia was able to attract only one. Hungary was most successful in 2016 (a 23 percent success rate) and the Czech Republic in 2015, with a 12 percent success rate (TC AV, 2018).

Discussion & conclusions

Relevant and comparable numbers are indispensable for analyzing and evaluating R&D measures and science policies. Both inputs and outcomes help us look thoroughly into the performance of R&D systems. However, if we want to study the details – for example, the drivers of changes in institutional or systemic behavior – quantitative data, which are often used for international comparisons, should be interpreted in the context of qualitative factors and other socio-economic developments taking place in society or specifically in the field of science or innovation policy. A purely macro-economic approach has its limitations: it can help us understand the long-term effects on economic growth of investing in, or cutting investment in, public R&D, yet it does not explain the effects of policy measures implemented at a finer level of detail (Soete, 2017). Therefore, we think that data and contextual developments are of equal importance for a better understanding of the functioning of any R&D system, as well as for potential learning and lesson-drawing.

When analyzing the performance of a system, we should look very carefully at its structure – whether it is homogeneous, or heterogeneous and fragmented, in comparison with other systems of similar size. How many research organizations operate within the system? Does the system include specialized institutions for basic research – for example, an academy of sciences? How big is the system? What is the role of universities in the R&D system? How big is the private higher education sector? Which fields are covered by existing research organizations and research infrastructures? What is the level of internationalization of the R&D system? Answers to these and other questions are needed in order to understand the functioning of the system.

Another important factor is the way R&D is financed in each country. There are basically two main streams through which research organizations can be funded from public resources: via grants provided by grant agencies, ministries and other providers, or via institutional support. What is the ratio between institutional funding and grants? What is the ratio between support for basic research and for applied research? How much funding do research organizations receive from business and from international sources?

The internal structure of the public resources invested in the public R&D sector should also be carefully scrutinized. During the analyzed period, all V4 countries used European Structural Funds4 to support their research – to build new research infrastructures and purchase research equipment. Investment in research infrastructure and equipment, mainly through the European Regional Development Fund (ERDF), played a very significant role in shaping the overall public research system. What was the role of European Structural Funds in each country? Were completely new research centers built? What was their position within the system?

Regarding outputs, it is very important to study what kind of financing and research evaluation system was in place during the analyzed period. For example, in the Czech Republic, a purely quantitative system was implemented between 2010 and 2017. The system, called the “coffee grinder”, was based on a sophisticated mathematical formula transforming the points assigned to various research outputs (journal articles, books, conference contributions, patents, prototypes, etc.) into institutional funding for research organizations (for more details, see for example Arnold & Mahieu, 2011; Good, Vermeulen, Tiefenthaler, & Arnold, 2015; Young, 2014). Taking this into consideration, it is clear that there was increased pressure to publish as much as possible, regardless of the publisher, as the system did not differentiate between results of different quality.
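The mechanics of such a points-based system can be illustrated with a short sketch. The point weights, output categories and budget below are hypothetical, chosen only to show the allocation logic, and do not reproduce the actual Czech methodology:

```python
# Hypothetical sketch of a "coffee grinder"-style allocation: institutional
# funding is distributed in proportion to points earned for counted outputs.
# Point weights and the budget are illustrative, not the Czech formula.
POINTS = {"journal_article": 10, "book": 40, "patent": 100, "proceedings": 4}

def allocate(budget, outputs_by_institution):
    """Distribute `budget` proportionally to each institution's total points."""
    points = {
        inst: sum(POINTS[kind] * count for kind, count in outputs.items())
        for inst, outputs in outputs_by_institution.items()
    }
    total = sum(points.values())
    return {inst: budget * p / total for inst, p in points.items()}

# Note the incentive: many low-weight outputs can outscore fewer,
# higher-quality ones, since quality itself is never assessed.
funding = allocate(
    1_000_000,
    {
        "Institute A": {"journal_article": 50, "patent": 2},  # 700 points
        "Institute B": {"journal_article": 20, "book": 5},    # 400 points
    },
)
print(funding)
```

Because the formula rewards volume alone, any output that earns points increases an institution's share of the fixed budget, which is exactly the pressure toward quantity described above.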

One example we did not use in our comparison of the V4 countries is the number of registered patents. In the Czech Republic, in the given period, an American or European patent was assigned the same number of points as a highly valued article in Nature. Czech Statistical Office data (2018) show that research organizations registered the highest number of patents in 2013. After patents and utility models were removed from the methodology, the number of submissions by public research organizations to the Industrial Property Office of the Czech Republic declined, not surprisingly. Although the number of awarded patents increased between 2010 and 2013, the number of licensing agreements and the income from licenses stagnated. This development strongly supports the hypothesis that the increase in R&D production in the public sector in this period cannot be seen primarily as a reflection of the increased efficiency of the system, but rather as a consequence of the introduction of the new methodology. The situation with scientific publications was the same: the mechanism generated a high number of low-quality or piecemeal publications, which satisfied the requirements of the “coffee grinder” and maximized the funding for the researcher, the research group and the institution.

By giving the above-mentioned examples from the Czech Republic, we want to demonstrate that in order to understand a system and evaluate its performance, quantitative data should be complemented by deeper knowledge of the system and a relevant narrative. At the same time, we are aware that analyzing and comparing R&D systems and science policies brings a number of challenges. For example, if one system focuses on expensive biomedical research and another on high-throughput IT research in artificial intelligence, we are comparing the incomparable. Biomedical research needs expensive facilities, clean rooms, many years of clinical trials and a long lead time. IT research can be done on reasonably cheap equipment, is quick, and brings results, products and publications within months or a year or two. The reaction times between investment and first results will be very different in the two systems, and so will the cost distributions. Evaluation methodologies will also have to differ.

Finally, we believe that the way the research and development system in each individual country is structured, how individuals and institutions are evaluated, how interaction between research organizations and businesses is supported, how R&D funding is awarded, etc., are as important as the amount of public money invested in the system. This is in line with the study by Soete, Verspagen, and Ziesemer (2020), who analyzed 17 OECD countries in the period between 1975 and 2014. They discovered that public investment in R&D in general increases a country's GDP, yet they found countries in which extra investment in R&D has small effects or even leads to lower productivity. The variety of results is attributed to different externalities of R&D systems and unobserved characteristics of the innovation systems. Donovan (2007) also argues that evaluation of R&D policies based purely on quantitative indicators might fail to capture social, environmental and economic returns.

The classical 4E model of performance in the public sector by Bouckaert and van Dooren (2003) distinguishes economy, efficiency, effectiveness and equity measures. The span of performance in the public sector is how inputs, outputs and outcomes are related to each other. Our approach stresses the effectiveness dimension: we argue that the environment and network members play as important a role as inputs and outputs. It is also important that trust was later incorporated into the original model (Bouckaert & Halligan, 2008).

Finally, we argue that the fact that an individual country steadily increases its investment in R&D (inputs) and generates a corresponding number of outputs does not necessarily mean that its science policy is being successfully implemented, that the best projects receive public grants, that the outputs are meaningful, or that a significant part of the accumulated knowledge is successfully transferred to industry and utilized by society as a whole. Best practices and measures can be implemented with various levels of success; not all of them are transferable and universally applicable across all systems. There are a number of factors influencing the success or failure of science policy: the history of the R&D system and its structure, norms in society in general, the political system, the legal system, the level of trust in government and among researchers themselves, and the general willingness to adopt best practices from abroad. We suggest that more research in these areas is needed, as well as deep knowledge of the science policies concerned, in order to provide relevant narratives.

Ethics

The study procedures were carried out in accordance with the Declaration of Helsinki.

Funding sources

No financial support was received for the study.

Authors' contribution

All authors had full access to all data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis. Aleš Vlk: study concept and design, main editor. Otakar Fojt: study concept and design, interpretation of data. Jiří Stanzel: statistical analysis.

Conflict of interest

The authors declare no conflict of interest.

Acknowledgments

No additional acknowledgements.

References

  • Arnold, E., & Mahieu, B. (2011). International audit of research, development & innovation in the Czech Republic – Synthesis report. Brighton: Technopolis Group.

  • Bouckaert, G., & Halligan, J. (2008). Managing performance – International comparisons. London: Routledge.

  • Bouckaert, G., & van Dooren, W. (2003). Performance management in public sector organisations. In E. Löffler, & T. Boivard (Eds.), Public management and governance (pp. 127–136). London: Routledge.

  • Český statistický úřad. (2018). Ukazatele výzkumu a vývoje za rok 2016 [Research and development indicators for 2016]. Praha: Český statistický úřad.

  • Donovan, C. (2007). The qualitative future of research evaluation. Science and Public Policy, 34(8), 585–597.

  • European Commission. (2010). Europe 2020: A strategy for smart, sustainable and inclusive growth. Communication from the Commission. COM(2010) 2020.

  • European Commission. (2018). The 2018 EU industrial R&D investment scoreboard. Available at: http://iri.jrc.ec.europa.eu/scoreboard18.html.

  • European Research Council. (2018). From mini-organs to ultrafast filming: ERC invests in early career researchers. Available at: https://erc.europa.eu/news/mini-organs-ultrafast-filming-erc-invests-early-career-researchers.

  • Good, B., Vermeulen, N., Tiefenthaler, B., & Arnold, E. (2015). Counting quality? The Czech performance-based research funding system. Research Evaluation, 24, 91–105.

  • Gough, M. (2016). Measuring the impact of R&D spending. Available at: https://www.natureindex.com/news-blog/measuring-the-impact-of-r-and-d-spending.

  • Macilwain, C. (2010). What science is really worth. Nature News, 456, 682–684.

  • Meo, S. A., Masri, A. A. A., Usmani, A. M., Memon, A. N., & Zaidi, S. Z. (2013). Impact of GDP, spending on R&D, number of universities and scientific journals on research publication in Asian countries. PloS One, 8(6).

  • OECD. (2018). OECD research and development expenditure in industry 2018. Available at: https://www.oecd-ilibrary.org/industry-and-services/oecd-research-and-development-expenditure-in-industry-2018_anberd-2018-en.

  • Ràfols, I. (2019). S&T indicators in the wild: Contextualization and participation for responsible metrics. Research Evaluation, 28(1), 7–22.

  • Soete, L. (2017). On the impact of openness on economic growth: An overview of the debate. In European Commission: Europe's future: Open innovation, open science, open to the world (pp. 22–28). Brussels: European Commission.

  • Soete, L., Verspagen, B., & Ziesemer, T. H. W. (2020). The economic impact of public R&D: An international perspective. UNU-MERIT Working Paper. Available at: https://www.merit.unu.edu/publications/working-papers/.

  • Technologické centrum AV ČR. (2018). ERC choropleth map. Available at: http://svizualizace.tc.cas.cz/maps/main.html.

  • UNESCO. (2010). UNESCO science report 2010: The current status of science around the world. Paris: UNESCO Publishing.

  • World Bank. (2018). National Science Foundation, science and engineering indicators. Available at: https://data.worldbank.org/indicator/IP.JRN.ARTC.SC.

  • Young, M. (2014). Coarsely ground. In J. Brankovič, M. Klemenčič, P. Lažetič, & P. Zgaga (Eds.), Global challenges, local responses in higher education. Higher education research in the 21st century series (pp. 15–33). Rotterdam: Sense Publishers.
2

In this paper, we use the term science policy as an equivalent to research and development (R&D) policy. At the same time, we talk about the R&D system on the national level.

3

For more details, see the European Research Council website at https://erc.europa.eu/.

4

For the 2014–2020 period, the official label is ESIF – European Structural and Investment Funds.

  • Arnold, E., & Mahieu, B. (2011). International audit of research, development & innovation in the Czech Republic – Synthesis report. Brighton: Technopolis Group.
  • Bouckaert, G., & Halligan, J. (2008). Managing performance – International comparisons. London: Routledge.
  • Bouckaert, G., & van Dooren, W. (2003). Performance management in public sector organisations. In E. Löffler & T. Bovaird (Eds.), Public management and governance (pp. 127–136). London: Routledge.
  • Český statistický úřad. (2018). Ukazatele výzkumu a vývoje za rok 2016 [Research and development indicators for 2016]. Praha: Český statistický úřad.
  • Donovan, C. (2007). The qualitative future of research evaluation. Science and Public Policy, 34(8), 585–597.
  • European Commission. (2010). Europe 2020: A strategy for smart, sustainable and inclusive growth. Communication from the Commission, COM(2010) 2020.
  • European Commission. (2018). The 2018 EU industrial R&D investment scoreboard. Available at: http://iri.jrc.ec.europa.eu/scoreboard18.html.
  • European Research Council. (2018). From mini-organs to ultrafast filming: ERC invests in early career researchers. Available at: https://erc.europa.eu/news/mini-organs-ultrafast-filming-erc-invests-early-career-researchers.
  • Good, B., Vermeulen, N., Tiefenthaler, B., & Arnold, E. (2015). Counting quality? The Czech performance-based research funding system. Research Evaluation, 24, 91–105.
  • Gough, M. (2016). Measuring the impact of R&D spending. Available at: https://www.natureindex.com/news-blog/measuring-the-impact-of-r-and-d-spending.
  • Macilwain, C. (2010). What science is really worth. Nature, 465, 682–684.
  • Meo, S. A., Masri, A. A. A., Usmani, A. M., Memon, A. N., & Zaidi, S. Z. (2013). Impact of GDP, spending on R&D, number of universities and scientific journals on research publication in Asian countries. PLoS ONE, 8(6).
  • OECD. (2018). OECD research and development expenditure in industry 2018. Available at: https://www.oecd-ilibrary.org/industry-and-services/oecd-research-and-development-expenditure-in-industry-2018_anberd-2018-en.
  • Ràfols, I. (2019). S&T indicators in the wild: Contextualization and participation for responsible metrics. Research Evaluation, 28(1), 7–22.
  • Soete, L. (2017). On the impact of openness on economic growth: An overview of the debate. In European Commission, Europe's future: Open innovation, open science, open to the world (pp. 22–28). Brussels: European Commission.
  • Soete, L., Verspagen, B., & Ziesemer, T. H. W. (2020). The economic impact of public R&D: An international perspective. UNU-MERIT Working Paper. Available at: https://www.merit.unu.edu/publications/working-papers/.
  • Technologické centrum AV ČR. (2018). ERC choropleth map. Available at: http://svizualizace.tc.cas.cz/maps/main.html.
  • UNESCO. (2010). UNESCO science report 2010: The current status of science around the world. Paris: UNESCO Publishing.
  • World Bank. (2018). National Science Foundation, science and engineering indicators. Available at: https://data.worldbank.org/indicator/IP.JRN.ARTC.SC.
  • Young, M. (2014). Coarsely ground. In J. Brankovič, M. Klemenčič, P. Lažetič, & P. Zgaga (Eds.), Global challenges, local responses in higher education (Higher Education Research in the 21st Century series, pp. 15–33). Rotterdam: Sense Publishers.
Hungarian Educational Research Journal (founded 2011 by the Hungarian Educational Research Association; published by Akadémiai Kiadó, Budapest; Gold Open Access; ISSN 2064-2199, Online).