Public policy instrument evaluation in service of enabling grand strategy discourse – Case of Horizon 2020 key indicators


Introduction

Relatively successful economic imaginaries have their own performative, constitutive force in the material world. Their operation presupposes a substratum of substantive economic relations and instrumentalities as their elements; in addition, where an imaginary is successfully operationalized and institutionalized, it transforms and naturalizes these elements and instrumentalities into moments of a specific economy with specific emergent properties. Economic imaginaries identify, privilege and seek to stabilize some economic activities from the sum of economic relations and turn them into objects of observation, calculation and governance. Technologies of economic governance, operating sometimes more semiotically, sometimes more materially, constitute their own objects of governance rather than governing already pre-constituted objects (Jessop 1990, 1997). Observed through the lens of cultural political economy, when the economic crisis hit the economic imaginary of the European Union, the variation in discourses led to the common European strategy for smart, sustainable and inclusive growth (Europe 2020), prepared to steer the EU member states towards a more innovative, better educated and cleaner EU. During the selection of particular discourses, Horizon 2020 was set up as the programme aimed at boosting EU innovation.

When questioning the possibility of rational strategic steering in the context of complex modern societies, we are dealing with a contradictory situation. On the one hand, there is an ever-increasing need for rational societal steering; on the other hand, the very attempt to deal with complexity by means of rational action produces hyper-complexity, thus rendering successful societal steering even less probable. Nevertheless, societies continuously attempt to find the key to two crucial elements of societal steering, i.e. efficient goal setting and control over implementation (Rončević and Makarovič 2011; see also Makarovič and Rončević 2010, Rončević 2008, Rončević and Makarovič 2010). Since the world cannot be grasped in all its complexity in real time (see Jessop 2010), actors (and observers) must focus selectively on some of its aspects in order to be active participants in that world and/or to describe and interpret it as disinterested observers (ibid.). In that way, the European Commission prepared key indicators (by which one could measure its own achievements and policy effects) and retained the policy by implementing it in relevant documents (guidelines and action plans) for all to follow.

Horizon 2020 is the biggest EU research and innovation programme ever, with a substantial budget of EUR 80 billion. Beyond the importance of transparent budget allocation, the expected effect should be seen not only in an innovation boost but, consequently, in a better life for every European. With this research, we do not aim to evaluate the importance of and need for grand strategies and policies. We do, however, evaluate their transparency. We anticipate problems with the accessibility of data, the timeliness of data and, above all, the comparability of data values. How will we know where we stand if we cannot make a comparative analysis? Even more, how can we set new policies and instruments if we do not know the outcomes and effects of previous ones? How can we ensure discourse if we do not have relevant information to support argumentation? The purpose of this article is to illustrate the problem of public policy evaluation with regard to the availability of information. By that, we want to warn about the issue of disabled discourse among relevant social groups and institutions in the EU.

Importance of the H2020 instrument

As Makarovič, Šušteršič and Rončević (2014, 610-626) say, the European Union (EU) has been continuously rethinking its global position amidst emerging economic and geopolitical challenges and attempting to formulate strategies to increase its competitiveness. Howorth (2010, 455-474) argues that the European Union, in the wake of Lisbon, has become an international actor and now faces two major external challenges. The first is to develop a strategic vision for a potentially tumultuous emerging multi-polar world. The European Council's December 2008 ‘Report on the Implementation of the European Security Strategy’ recognized that, over the last five years, the threats facing the EU had become ‘increasingly complex’ and that ‘we must be ready to shape events (by) becoming more strategic in our thinking’. The second is to help nudge the other major actors towards a multilateral global grand bargain. The author further argues (ibid.) that such a bargain will be the necessary outcome of the transition from a US-dominated post-1945 liberal world order towards a new 21st-century order accommodating the rising powers and sensitive to the needs of the global south. Without such a comprehensive and co-operative bargain, the emerging multi-polar world will be rife with tensions and highly conflict-prone (ibid.).

In that regard, the European Commission launched Horizon 2020 in 2014. H2020 is the financial instrument implementing the Innovation Union, a Europe 2020 flagship initiative aimed at securing Europe's global competitiveness (European Commission n. d.). H2020 is seen as a means to drive economic growth and create jobs. As the European Commission states on its webpage (European Commission n. d.), Horizon 2020 has the political backing of Europe’s leaders and the Members of the European Parliament. They agreed that research is an investment in our future and so put it at the heart of the EU’s blueprint for smart, sustainable and inclusive growth and jobs (ibid.).

“By coupling research and innovation, Horizon 2020 is helping to achieve this with its emphasis on excellent science, industrial leadership and tackling societal challenges. The goal is to ensure Europe produces world-class science, removes barriers to innovation and makes it easier for the public and private sectors to work together in delivering innovation” (European Commission n. d.).

Klisz and Aluchna (2012) argue that European companies consider Horizon 2020 one of the most important funding programmes of this century. They say that this funding will also serve to promote international cooperation, which will enhance the attractiveness of EU research and enable the joint addressing of global issues and the promotion of EU external policy. An analysis of possible synergies with funds awarded under the EU cohesion policy will be made to involve a greater number of participants across Europe. The weaker regions will identify potential centers of excellence, to which strategic advisory and financial support will be offered under the Horizon 2020 programme, while the EU Structural Funds will be spent on upgrading infrastructure and equipment (ibid.).

As Renda says (2015, 20-24), EU institutions have increasingly shown "performance anxiety" in trying to catch up with the growing gap between the EU and the United States in this field, claiming that Europe was experiencing a true innovation emergency (ibid.). The past years of innovation policy have brought some beacons of hope. These are not exclusively related to the amount of money allocated to research and innovation in the EU budget but are mostly due to the emphasis placed on the governance of innovation rather than on the selection of projects based on pre-determined criteria. In this respect, the three pillars of Horizon 2020 – excellent science, industrial leadership and societal challenges – appear much more in line with the needs of potential innovators and entrepreneurs than past programmes such as the 7th Framework Programme for Research and the Competitiveness and Innovation Programme for SMEs (ibid.).

Horizon 2020 reflects efforts to invest in a massive accumulation of laboratories that can perform extremely sophisticated research (Klisz and Aluchna 2012). There is much information about research and innovation implementation in Europe under Horizon 2020, but this idea is a challenge in a time of economic crisis. This does not mean that funds will be used solely for these disciplines and research (ibid.). The budget of the ERC, which finances basic research, has been increased by 70 percent. This means tremendous support for the funding of theoretical and basic research, which is often determined by the development of implementation studies (ibid.).

The European Commission states that Horizon 2020 is open to everyone, with a simple structure that reduces red tape and time so participants can focus on what is important. This approach makes sure new projects get off the ground quickly – and achieve results faster (European Commission n. d.). The steering is done by the EU, and the rest is left to the beneficiaries (HEIs, SMEs...) and evaluators to find the niches in which to carry out in-depth projects.

The current form of Horizon 2020 will accumulate funds and investments in very large scientific enterprises (Klisz and Aluchna 2012). An additional benefit is the scientific specialization of the various European regions. The creation of powerful research laboratories will bring a regional focus to the most innovative research projects in specific fields (ibid.). The authors also make the relevant comment that it is important not to duplicate the same infrastructure in every region, but to adapt it to the research capacity, the quality of academic staff and the realistic possibilities of implementation (ibid.).

Questioning Grand Strategies

All in all, the promises of the H2020 grand strategy are very positive and grand in the true meaning of the word. But is steering the field of research and development through a grand strategy appropriate? Makarovič, Šušteršič and Rončević (2014, 610-626) argue that the EU is recognized for the long-standing implementation deficit of its grand strategies, including the initially ambitious Lisbon Strategy, and ask whether Europe 2020 is set to fail as well (ibid.). According to the Innovation Union Scoreboard (IUS) indicator (Veugelers and Cincera 2015, 4-9), developed by the European Commission in support of its Innovation Union Strategy, Europe is not doing well. Europe's gap relative to the US holds across almost all individual indicators that go into the IUS score. This reflects the systemic nature of Europe's failing innovation capacity. Europe's overall R&D-to-GDP ratio currently stands below two percent, significantly lower than the ratios in the US, Japan, South Korea and Singapore. Furthermore, there are relatively few signs of progress. China is fast catching up and is already on par with the EU (ibid.).

From another point of view, Frietsch, Rammer and Schubert (2015, 9-13) say that the US is still the most important national science and research system in the world, with China quickly catching up – not only in terms of quantity but also in terms of quality. Europe, however, as the largest transnational science and research system, is ahead of these national systems. They argue (ibid.) that recent analysis suggests the European Union as a whole has overtaken the US in the performance of its science system. This is due not only to input factors but also to an increase in output. The US, on the other hand, has continuously lost ground in recent years, as is exemplified by the results of the Innovation Indicator (ibid.).

Frietsch, Rammer and Schubert (2015, 9-13) see one of the reasons for this gradual decline in the country's science and research policy, which is traditionally designed as non-interventionist and market-conforming and envisions a rather passive role for the state. They say (ibid.) that one result of this policy was that for many years public spending on science and research in the US did not increase at the same rate as it did in most other highly developed countries. In addition, the US struggled with the economic and financial crises, and other policy areas had priority over science and research. Even the US economic stimulus package, which envisaged a slightly more active role for science and research policymaking, produced only a flash in the pan, leaving the overall trend hardly affected (Frietsch, Rammer and Schubert 2015, 9-13). One cannot argue against the fact that there is considerable heterogeneity among the science and innovation systems in Europe (Frietsch, Rammer and Schubert 2015, 9-14), not only concerning their resource endowment but also concerning their efficiency in producing scientific and innovative outputs (ibid.).

Weighing these pros and cons, there is a short but very relevant statement by Makarovič, Šušteršič and Rončević (2014, 610-626): strategic steering is essentially a discursive practice influenced by both semiotic and extra-semiotic factors. Hence, the success or failure of a strategy essentially depends on the ability to steer the discourse.

Why measuring is important

The EC states that Horizon 2020 is the biggest EU Research and Innovation programme ever, with nearly €80 billion of funding available over seven years (2014 to 2020). The EC adds the significant point that it expects this amount to attract additional private investment (see European Commission, n. d.). Both the EC and the EU are aware that research in Europe cannot be financed from public funds alone. Thus, system support for the research programmes will also require input from industry and, most importantly, from local SMEs (Klisz and Aluchna 2012). The EC promises more breakthroughs, discoveries and world-firsts by taking great ideas from the lab to the market (see European Commission, n. d.).

How has EU Research and Innovation funding evolved over recent years? According to Table 1, the budget has increased dramatically over the last three framework programmes.

Table 1: Framework programmes for research and innovation, 1984–2020

ID               | Period    | Budget (billions of €, constant prices)
FP1              | 1984–1987 | 3.3
FP2              | 1987–1991 | 5.4
FP3              | 1990–1994 | 6.6
FP4              | 1994–1998 | 13.2
FP5              | 1998–2002 | 14.9
FP6              | 2002–2006 | 19.3
FP7              | 2007–2013 | 50.5
FP8 Horizon 2020 | 2014–2020 | 74.8

Source: EPRS 2015
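
To make the scale of this growth concrete, the short Python sketch below computes growth factors directly from the constant-price figures in Table 1 (EPRS 2015); it introduces no data beyond the table itself.

```python
# Framework programme budgets in billions of EUR, constant prices (Table 1, EPRS 2015).
budgets = {
    "FP1": 3.3, "FP2": 5.4, "FP3": 6.6, "FP4": 13.2,
    "FP5": 14.9, "FP6": 19.3, "FP7": 50.5, "FP8 Horizon 2020": 74.8,
}

def growth_factor(start: str, end: str) -> float:
    """How many times larger the 'end' programme budget is than the 'start' one."""
    return budgets[end] / budgets[start]

# Horizon 2020 is roughly 22.7 times the FP1 budget and about 1.5 times the FP7 budget.
print(f"FP1 -> H2020: {growth_factor('FP1', 'FP8 Horizon 2020'):.1f}x")
print(f"FP7 -> H2020: {growth_factor('FP7', 'FP8 Horizon 2020'):.1f}x")
```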

Klisz and Aluchna (2012) argue that, although the budget seems extremely high, it is only one-third of what China is planning to invest in R&D. The newly re-defined European research and innovation effort comprises various EU investments which will become the foundation for the development of a region now plunged into a deep crisis. The new dimension of R&D will become part of the stimulation of the European market. This has a much better impact than pumping more funds into failing banks and public budgets (ibid.). The authors stress that Horizon 2020 is not created for patching holes but to increase the development and stimulation of the region, and it is designed to bring cooperating organizations together to improve the EU's position in the highly competitive global market (ibid.).

With all the expected outcomes and long-term effects, one still needs to keep in mind that this is a programme with high financial stakes and uncertain outcomes. How can it be made as successful as possible? Makarovič, Šušteršič and Rončević (2014, 610-626) say that, in the case of the Lisbon Strategy, communicating the strategy has been part of communicating the EU itself. Segmental units such as local communities, nation-states as well as sub- and supranational regions are not pre-given "natural" entities but social systems produced and reproduced only through communication. The EU is no exception: it exists as long as it is able to communicate itself, since it, just like its nation-states (although they may seem somewhat more "given" to common sense), only exists as communication. The Lisbon Strategy has contributed to this communication, and so may future strategies, through their success or even through another failure (Makarovič, Šušteršič and Rončević 2014, 610-626).

Following the above statement, one can agree that if the EU is to develop the ability to meet challenges with a strategic approach (Makarovič, Šušteršič and Rončević 2014, 610-626), repeating current communicative processes will not be sufficient. The authors suggest (ibid.) that for Europe 2020 or any subsequent economic imaginary, it is vital that the EU develops more efficient mechanisms of "retention" of selected discourses, as well as selective "recruitment", "inculcation and retention" by relevant social groups and institutions. Without this, no economic imaginary can become successful. The success or failure of a strategy essentially depends on the ability to steer discourse (ibid.).

If discourse is the deal-breaker for the success or failure of a strategy, measures should be put in place to enable that discourse. Is the money well spent? Were the amounts sufficient? Are the funded fields relevant? Does it strengthen EU innovation, the economy, sustainability, a better life? How are regions dealing with the policy? These and similar questions can be discussed meaningfully only if we have relevant data to support the argumentation. The European Commission itself says (European Commission 2015) that monitoring generates data on an intervention's activity and impact over time, continuously and systematically. It helps identify and address any implementation problems of an intervention and at the same time generates factual data for future evaluation and impact assessment (ibid.). Further on, the EC agrees that evaluation takes a broader look (ibid.) at all aspects of performance, considering “whether” the changes and any movement towards the set objectives are due, at least in part, to the intervention and “why” an intervention has been more or less successful in achieving its policy objectives. It looks at what has happened, why something has occurred and, in particular, how much has changed as a consequence (ibid.).

Public policies set their own key indicators with the aim of measuring the outcomes and effectiveness of the policy. In that regard, the European Commission says (European Commission 2015) that assessing the impact of Horizon 2020 on growth and jobs through indicators at project and programme level, including in terms of its efficiency and quality, is a challenge. Reliable indicators of results and impacts are limited, the importance of individual indicators varies by discipline and sector, and there can be a considerable time lag between inputs and outputs (ibid.). At this point, however, it is also important to decide what we want to measure and evaluate when referring to H2020. Do we want to know how countries and regions are doing among themselves and discuss regional capabilities, or do we (as EU members) compete together, namely the EU versus the USA, China, Japan, etc.?

H2020 will have an important effect on EU countries. Frietsch, Rammer and Schubert (2015, 9-13) say that they expect that if the Horizon 2020 programme follows its rhetoric and focuses on funding world-class researchers and disruptive research in enabling industrial technologies, some member states will benefit more from these investments than others. Therefore, Horizon 2020 will likely increase the heterogeneity of innovation systems in Europe, while its impact on growth and jobs will hardly target those countries that need it most urgently. They also expect (ibid.), however, that the aims of excellent research, increased growth and job creation are better attainable by Horizon 2020 across the whole of Europe by foregoing the goal of reduced heterogeneity. In a longer perspective, a general upgrade of science and innovation and an increase in the number of research- and innovation-oriented member states is a worthwhile pathway. As such, this new approach in Horizon 2020 is especially promising, even for countries that are less oriented towards science and innovation (ibid.).

The statements above need to be taken into consideration especially when talking about equal development inside the EU. Nevertheless, for the purpose of this article we will research the key indicators for H2020 Industrial Leadership and follow the aim set by the European Commission: "... securing Europe's global competitiveness…" (European Commission n.d.), meaning we will look into the set key indicators and evaluate whether they are relevant for a comparative analysis of the EU versus other industry-leading countries. As stated in the introduction, we do not aim to evaluate the importance of and need for grand strategies and policies; we do, however, evaluate their transparency. We anticipate problems with the accessibility of data, the timeliness of data and, above all, the comparability of data values. How will we know where we stand if we cannot make a comparative analysis? How can we set new policies and instruments if we do not know the outcomes and effects of previous ones? And how can we ensure discourse if we do not have relevant information to support argumentation?

Evaluation through the eyes of policy analysis

As written above, to be able to discuss and to steer the discourse, one needs relevant information. To ensure relevant information, policy analysis is a suitable tool. The aim of applied policy analysis is to formulate findings on a relatively narrow problem, on issues that are directly applicable in the processes of political decision-making (Fink-Hafner and Lajh 2007, 20-21). It differs from academic policy analysis in its length and in the problems it addresses, since the academic type of policy analysis deals with theory and with "big questions" and is of a more explanatory nature (ibid.).

Public policy can be defined as a long series of more or less related choices, including decisions not to act, taken by government bodies and officials (Dunn 1994, 61). The processes of designing and implementing public policies are empirical processes (Fink-Hafner 2007, 17). Several theoretical models of public policy analysis have been developed to help clarify the empirical decision-making processes on public policies, the characteristics of these processes and their effects. In the literature, the following models are most often presented: institutional theory (institutionalism), rational theory (rationalism), incremental theory (incrementalism), mixed scanning, process theory, group theory, elite theory, game theory and public choice theory (Fink-Hafner 2007, 33-34). For the understanding "of" science and "in" science regarding public policy (Fink-Hafner 2007, 17), the public policy process model is very useful; it is based on the understanding of the process of shaping public policies as a sequence of temporally separated and substantially different phases:

identification of public policy issues and the creation of a political agenda;

the formation of public policy alternative solutions to the social problem;

legalization of the chosen public policy solution;

implementation of public policy;

evaluation of public policy effects (Fink-Hafner 2007, 17).

Public policy actors have public policy mechanisms (instruments, tools, techniques and measures) at their disposal to influence the processes of policy design and implementation (Pal 1987). The most obvious (Kotar 2002, 53) is the special position of state structures as decision-makers in policy-making, since they have sole legal powers and privileged use of, or access to, public policy mechanisms. The power of state actors is exemplified in particular by their defining of the decision-making process or their deferring of items on the agenda of solving public problems. Non-state public policy actors do not have such power in institutionalized decision-making on public policies (ibid.).

According to Majchrzak (1984), the relevant public policy mechanisms concerning the issues discussed here are the following:

- mechanisms related to information: the production of information through the collection, display, evaluation and monitoring of information; grouping information; disseminating information through reports, conferences, etc.; stimulating interests through publicity, propaganda, intimidation and threats; retention of information; templates of model legislation;

- financial mechanisms: investments, compensation of losses; resource redistribution; setting financing priorities, etc.

- regulatory and control mechanisms: regulatory provisions; legislation; setting standards; granting monopolies, etc.

- operational mechanisms: construction and management of facilities; public works (e.g. construction), etc.

- mechanisms that are directly linked to public policies: the creation of a political agenda; defining priorities and objectives; postponing decisions; coordinating public policies (Majchrzak 1984, 26).

Bayers (2004, 213-216) defines two impact strategies: voice and access. Voice strategies refer to public policy strategies such as media campaigns or protests. They represent activities in various public spheres, an arena where communication between social groups, policy-makers and citizens becomes visible to the general public. We can distinguish between information policies and protest policies. Both represent the presentation of information at a particular strategic point, except that the latter are conceptually different in that they contain an explicit staging of events to attract attention and expand the conflict. They are easier to distinguish according to the actions used by players within these two strategies, namely:

- information policies: organizing press conferences, informing through brochures and leaflets, participating in media debates, involving well-known persons in campaigns, etc.;

- protest policies: organizing manifestations/demonstrations, street actions, petitioning, civil disobedience/disruptive activities (Bayers 2004, 226).

Access strategies are synonymous with lobbying (Bayers 2004, 213-216). They concern settings where political bargaining takes place. They involve exchanging policy-relevant information with civil servants through formal or informal networks. Contrary to voice strategies, access strategies transfer information directly from stakeholders to policymakers. They are mainly used for the transmission of operational and technical information (ibid.). In practice (Bayers 2004, 213-216), it is difficult to distinguish between the various manifestations of political mobilization, which can be grouped into an overall strategy of influence. The choice and combination of tactics are shaped by two possible constraints: the first is the costs and benefits associated with different strategies, the second the position in the playing structure, or the actual gain. The author (Bayers 2004, 213-216) says that, to understand changes in the use of political strategies, analysts have hypothesized that specific interests use voice less and seek access more than diffuse interests do. The difference between diffuse and specific interests relates to the interests of the constituents whose mobilization made them a reality (Bayers 2004, 213-216).

Evaluation research of public policies is an integral part of scientific research methodology; at the same time, it is distinct and very specific, never completely methodologically objective, and precisely for this reason exceptionally varied and complex (Kustec-Lipicer 2009, 22-23). Evaluation research is a set of research procedures, targeted research that enables the acquisition and presentation of evaluative assessments of the public policy content studied (Kustec-Lipicer 2009, 117). It must be designed specifically for each case studied, including the contextual factors relevant to that case (Bressers, van Twist and ten Heuvelhof 2013, 23-37).

Evaluation (Parsons in Kustec-Lipicer in Fink-Hafner 2007, 177) has a particular role at two key points in the life process and/or cycle of a public policy:

- in the phase before the formal adoption of a public policy, when it provides a balanced evaluation of all potential alternative solutions and their effects (ex-ante evaluation),

- in the phase that follows the implementation of the already adopted public policy, when it is necessary to collect and evaluate the actual effects caused by the adopted public policy (ex-post evaluation) (ibid.).

Different views and approaches can also be grouped into four larger categories (see Kustec-Lipicer 2009, 81-113):

- Time series (based on the expectation of the initiators that evaluations are properly carried out in the various periods before, during and after the implementation of public policy);

- Performance category (its key content refers to the question of the entities or groups that evaluate the phenomena studied: who those entities are, what purposes and objectives they have, and from which positions the evaluation is performed);

- Content category (the most important element is the evaluator's motive: to improve and develop public policy through evaluation, and to pronounce judgement on it); and

- Methodological type of evaluation of public policies (ibid.).

The categories do not exclude each other but are complementary, or at least they intertwine (ibid.). For the purpose of this article, we will evaluate the public policy instrument H2020. We will not evaluate the indicators set by the policy instrument documents per se; the evaluation will cover the availability of the data that should enable interim and ex-post evaluation.

Key indicators and their evaluation

Indicators are used by people in everyday life and in particular provide the basis for companies' or governments' decisions (Cornescu and Adam 2013). Indicators are used not only by researchers and stakeholders but also by civil society to better understand specific interests. Cornescu and Adam (2013) say that people use indicators for daily decisions; voluntarily or involuntarily, people are always using indicators when they analyze, forecast and so on. Their importance lies in the fact that indicators describe a topic of interest, reduce information overload for data users and provide the necessary information for decision-making. The power of indicators is also their weak point: when they are intended to describe a wider topic of interest, the selection of one or more representative indicators is difficult, and there is a risk of information loss or of manipulation of the data obtained (Cornescu and Adam 2013).

From the literature (see Cornescu and Adam 2013) we can identify several important features that a relevant indicator should meet (see the sketch after this list):

be specific - to clearly identify the results;

be measurable - preferably to be quantitative;

be practical – such that it can be used;

be available - allowing the necessary data collection for the indicator;

be transparent in methodology and selection;

be well-grounded scientifically (ibid.).
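
A minimal sketch of how such criteria can be applied as a simple checklist when screening an indicator; the example assessment at the bottom is purely illustrative and is not taken from the cited sources.

```python
from dataclasses import dataclass, fields

@dataclass
class IndicatorCheck:
    """Checklist of the features listed above, for a single indicator."""
    name: str
    specific: bool
    measurable: bool
    practical: bool
    available: bool
    transparent: bool
    well_grounded: bool

    def failed_criteria(self):
        """Return the names of the criteria the indicator does not meet."""
        return [f.name for f in fields(self)
                if f.name != "name" and not getattr(self, f.name)]

# Purely illustrative assessment (not from the cited sources).
patents = IndicatorCheck(
    name="Patent applications", specific=True, measurable=True,
    practical=True, available=False, transparent=True, well_grounded=True,
)
print(patents.failed_criteria())  # -> ['available']
```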

Since there are 28 member states in the EU, semiotics in setting key indicators that all can understand and follow in the same way is necessary. The European Commission (see European Commission, n. d.) set out measures to complete and further develop the European Research Area through the EU Framework Programme for Research and Innovation. These measures aim at breaking down barriers to create a genuine single market for knowledge, research and innovation (ibid.). The European Commission also stated that, despite the complexity, the Horizon 2020 indicators will deliver information on outputs and results across all areas of the programme. They will provide the basis for analyzing the nature and scale of the impact of Horizon 2020 on the European research and innovation system, and how Horizon 2020 has contributed to building a society and an economy based on knowledge and innovation across the Union by leveraging additional research, development and innovation funding (EC DGRI 2015).

The legal basis of Horizon 2020 specifies a list of compulsory Key Performance Indicators to be considered in its evaluation and monitoring system (EC DGRI 2015). The legal basis also indicates a list of 14 cross-cutting issues that serve to monitor Horizon 2020 programme implementation on an annual basis and which are reported in the Annual Horizon 2020 Monitoring Report (ibid.). The Key Performance Indicators are divided into three sections: Excellent Science, Industrial Leadership and Societal Challenges. For the purpose of this article, we will look into the Industrial Leadership performance indicators, and we will make a general search of the indicators. We need to take into account that there are data not available to the general public which would enable different kinds of calculations. Nevertheless, the general data needed to evaluate the indicators should be broadly available. There are several sources offered by the European Commission for tracking the data.

The Horizon Dashboard is a webpage that provides monthly, aggregated data (see EC n. d.):

Implementation figures – an overview of evaluated proposals (incl. success rates) and detailed statistics and data on funded projects and their participants, broken down by country and region, research domain/programme part, organization type, etc. (a minimal sketch of such a country breakdown follows this list);

Country Profiles – where one can find out more about how a country is performing in Horizon 2020: funding received, participation by region, top beneficiaries, collaboration with other countries, SME participation and much more;

Project Results – information on the results of funded projects, notably Intellectual Property Rights (IPRs) and scientific publications (ibid.).
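
Assuming one has exported participation data from the Horizon Dashboard (or the related CORDIS open data) to a local CSV file, a minimal pandas sketch like the one below can reproduce the "funding received by country" view described above. The file name and column names (country, ec_contribution) are our assumptions about such an export, not a documented schema.

```python
import pandas as pd

# Hypothetical local export of H2020 participation data; the file name and
# column names are assumptions, not a documented Dashboard/CORDIS schema.
df = pd.read_csv("h2020_participations.csv")

# EU contribution received per participating country, largest first.
by_country = (
    df.groupby("country")["ec_contribution"]
      .sum()
      .sort_values(ascending=False)
)

print(by_country.head(10))  # top 10 beneficiary countries by funding received
```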

Expecting to find results for all key indicators, one can in fact find only two, namely IPRs and scientific publications, the latter without additional information on public-private publications. In 2017, the Interim evaluation of Horizon 2020 was published. The evaluation assesses (see EC DGRI 2017) Horizon 2020's current progress towards its objectives. The findings are to contribute to drafting the last Work Programme for 2018-2020, provide the evidence base for the report of the High-Level Expert Group on maximizing the impact of EU R&I programmes and inform the design of future Framework Programmes (ibid.). The document itself is well structured and offers a range of data and data explanations. It addresses all the key indicators with more or less up-to-date data. But there is one general remark: the data is available for the EU as a whole. Data per country or comparison with non-EU countries is not available. We break down the information on the key indicators below.

We have also noticed that most of the data for the selected indicators were provided by CORDA. CORDA is the European Programme (see CORDA) for the establishment of a European capacity for Earth Observation. Copernicus products are created using satellite imagery and in situ data, which is defined as all non-space-borne data with a geographic dimension, including observation data from ground-, sea- or airborne sensors as well as reference and ancillary data licensed or provided for use in Copernicus (ibid.). This does not have much to do with industry and innovation per se; however, if they can provide spot-on data, one should not have any concerns. There is, though, one fact that does not imply good transparency: the general public does not have insight into the database. Their webpage states: “If you are not a CORDA Data Provider please note that due to data policy issues CORDA is currently only available to Copernicus services and their contractors” (CORDA).

Searching through the European Commission webpages, one can also find the European Innovation Scoreboard 2019. The annual European Innovation Scoreboard (EIS) provides a comparative assessment of the research and innovation performance of the EU Member States and the relative strengths and weaknesses of their research and innovation systems (EC DGRI 2019). It helps the Member States assess areas in which they need to concentrate their efforts to boost their innovation performance. Innovation performance is measured using a composite indicator – the Summary Innovation Index – which summarizes the performance of a range of different indicators. The EIS distinguishes between four main types of indicators – Framework conditions, Investments, Innovation activities, and Impacts – and ten innovation dimensions, capturing 27 indicators in total (ibid.).
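
As a rough illustration of how a composite index of this kind condenses many indicators into one score, the sketch below min-max normalizes each indicator across countries and averages the normalized values. This is a simplified, assumed construction with invented numbers, not the official EIS methodology.

```python
# Hypothetical raw values for three indicators in three countries (invented data).
raw = {
    "Country A": [2.1, 35.0, 0.8],
    "Country B": [1.4, 50.0, 1.2],
    "Country C": [3.0, 20.0, 0.5],
}
n_indicators = 3

# Min-max normalize each indicator across countries, then take the unweighted mean.
mins = [min(vals[i] for vals in raw.values()) for i in range(n_indicators)]
maxs = [max(vals[i] for vals in raw.values()) for i in range(n_indicators)]

summary = {
    country: sum((v - lo) / (hi - lo) for v, lo, hi in zip(vals, mins, maxs)) / n_indicators
    for country, vals in raw.items()
}

for country, score in sorted(summary.items(), key=lambda item: -item[1]):
    print(f"{country}: {score:.2f}")
```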

The EIS (EC DGRI 2019) provides a comparative assessment of the research and innovation performance of the EU Member States and selected third countries and of the relative strengths and weaknesses of their research and innovation systems. It helps countries assess areas in which they need to concentrate their efforts to boost their innovation performance (ibid.). The document contains not only the data but also the methodology, where the data was collected, the definitions of the indicators, etc. The main measurement framework of the European Innovation Scoreboard was significantly modified in 2017 (EC DGRI 2019). For the 2019 edition, no changes have been made to the main measurement framework. However, due to data revisions for some indicators, the results for earlier years in this report are not directly comparable to those reported in previous editions of the EIS (ibid.). The changes were made following a need for additional contextual analyses: to better understand performance differences between the innovation indicators used in the main measurement framework, a set of contextual indicators was introduced to the country profiles in the 2017 edition and revised in the 2018 edition (ibid.). Though comparison with previous reports cannot be made, the changes can only be marked as positive, because they enable better understanding. What we do miss in this document is alignment with the H2020 key indicators. Only four of them are listed in the document: venture capital, SMEs introducing product or process innovations, public-private co-publications per million population, and PCT patent applications per billion GDP. Below we describe where we found data for the indicators and to what extent.

Patent Applications

The data for Patent Applications in the Interim evaluation of Horizon 2020 (see EC DGRI 2017, page 132) is available for the EU. Data per country or comparison with non-EU countries is available only up to the year 2013. Searching the EUROSTAT database gives us poor statistics on patent applications (Eurostat 1). The numbers for the EU are up to date; however, comparison with other countries is not possible: other data are available only up to the year 2014 (we are in 2019 now), for Russia and Japan up to 2013, and there are no data for China (for more information see EUROSTAT 1). There is, however, another source from which one can obtain information: WIPO, the World Intellectual Property Organization. One cannot search the metadata, but there are two up-to-date publications available that provide information answering our Patent Applications indicator: the Patent Cooperation Treaty Yearly Review 2019 (WIPO 2019) and the World Intellectual Property Indicators 2018 (WIPO 2018). The Patent Cooperation Treaty Yearly Review 2019 (WIPO 2019) offers interesting statistics on the international phase: PCT applications, global trends in PCT applications, PCT applications by receiving office, PCT applications by origin, PCT applications by applicant type, top PCT applicants, PCT applications by field of technology, even the participation of women inventors in PCT applications, and many others (see WIPO 2019). The data in the World Intellectual Property Indicators 2018 (WIPO 2018) go up to the year 2017, but the breakdown of data is detailed, enables national comparison and is presented very graphically (see WIPO 2018).

Private Companies Introducing Innovations

The data for Private Companies Introducing Innovations in the Interim evaluation of Horizon 2020 (see EC DGRI 2017, page 153) is available for the EU. Data per country or comparison with non-EU countries is not available. The source of the data was CORDA, to whose database we do not have access. We found an additional source at the OECD. Its data enables comparison by country, but only up to 2014 (see OECD 1).

Joint Public-Private Publications

The data for Joint Public-Private Publications in the Interim evaluation of Horizon 2020 (see EC DGRI 2017, 132) is available for the EU. Data per country or comparison with non-EU countries is not available. Searching through the H2020 Dashboard, one can find information on scientific publications by year and priority, and a comparison with other countries, but one cannot find information on joint public-private publications. The H2020 Dashboard captures data on publications from the Scimago Journal & Country Rank (see SJR). The website offers a broad database and rankings by journal or by country. It is possible to filter data by subject area, subject category and region for the years 1996 to 2018. There is, however, no information about joint public-private publications. To broaden the search, we ran the keywords through Google, trying to find other sources, but we were not successful. In the end, we found the data in the European Innovation Scoreboard 2019. In the Annex of that document, one can read that the data source was Eurostat, but in the main text it is written that the data was provided by Scopus, Science-Metrix, as part of a contract with the European Commission (see EC DGRI 2019).

Venture Capital Investments

The data for Venture Capital Investments in the Interim evaluation of Horizon 2020 (see EC DGRI 2017, page 141) is available for the EU. Data per country or comparison with non-EU countries is not available. Eurostat offers data on venture capital investments only for the year 2015, and only for several EU countries (EUROSTAT 2). The OECD offers data on venture capital up to the year 2017, by country, for start-ups and later-stage ventures (see OECD 1). The European Innovation Scoreboard 2019 offers the same data (EC DGRI 2019).

Debt Financing

The data for Debt Financing in the Interim evaluation of Horizon 2020 (see EC 2017, page 143) is available for the EU. Data per country or comparison with non-EU countries is not available. The OECD offers data on financial corporations' debt-to-equity ratios by country and by year, up to 2017 (see OECD 2).

Number of organizations funded

The data on the Number of organizations funded in the Interim evaluation of Horizon 2020 (see EC DGRI 2017, page 141) is available for the EU, for the years 2014-2016. Data on the number of organizations funded is also available on the H2020 Dashboard. Data for non-EU countries is not necessary for this indicator.

SMEs that have introduced innovations to the company or to the market

The data on SMEs that have introduced innovations to the company or to the market in the Interim evaluation of Horizon 2020 (see EC DGRI 2017, page 153) is available for the EU. Data per country or comparison with non-EU countries is not available. The European Innovation Scoreboard 2019 has the data available for the year 2016 and enables comparison with other countries. In the Annex of that document, one can read that the data source was Eurostat, but in the main text it is written that the data was provided by the OECD (see EC DGRI 2019).

Turnover of company, Number of employees

There is no concrete data on SME growth and job creation in participating SMEs, or on the turnover of companies or the number of employees, in the Interim evaluation of Horizon 2020. Eurostat shares similar data on SMEs up to the year 2015, but not the exact indicators (see Eurostat 3). The European Innovation Scoreboard 2019 includes data on employment in fast-growing enterprises of innovative sectors and employment in knowledge-intensive activities (see EC DGRI 2019).

Table 2: Horizon 2020 Industrial Leadership Key Performance Indicators

No. | Key performance indicator | Definition of the indicator | Where to find | Year of data available | Enables comparison with non-EU countries
1 | LEIT* – Patent applications and patents awarded in the different enabling and industrial technologies | Number of patent applications by theme; number of awarded patents by theme | Interim evaluation of Horizon 2020; WIPO | 2017 | Yes
2 | LEIT – Percentage of participating firms introducing innovations new to the company or to the market (covering the period of the project plus three years) | Percentage of private companies introducing innovations in the total number of project participants validated as private companies | Interim evaluation of Horizon 2020; OECD | 2017 | No/Yes
3 | LEIT – Number of joint public-private publications | Number and percentage of joint public-private publications out of all LEIT publications | Interim evaluation of Horizon 2020; European Innovation Scoreboard 2019/Scopus/Science-Metrix | 2018 | Yes
4 | Risk Finance – Total investments mobilised via debt financing and Venture Capital investments | Total investments mobilised via Venture Capital investments | Interim evaluation of Horizon 2020; OECD; European Innovation Scoreboard 2019 | 2017 | Yes
5 | Risk Finance – Total investments mobilised via debt financing and Venture Capital investments | Total investments mobilised via debt financing | Interim evaluation of Horizon 2020; OECD | 2017 | Yes
6 | Risk Finance – Number of organizations funded and amount of private funds leveraged | Number of organizations funded; amount of private funds leveraged | Interim evaluation of Horizon 2020; H2020 Dashboard | 2014–2016 | Not necessary
7 | SME – Percentage of participating SMEs introducing innovations new to the company or the market (covering the period of the project plus three years) | Number and percentage of participating SMEs that have introduced innovations to the company or to the market | Interim evaluation of Horizon 2020; European Innovation Scoreboard 2019/Eurostat, Community Innovation Survey/OECD | 2016 | Yes
8 | SME – Growth and job creation in participating SMEs | Turnover of company; number of employees | Eurostat | 2015 | No

* LEIT – Leadership in Enabling and Industrial Technologies

Source: EC DGRI 2015, authors' contribution
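
Table 2 can also be read programmatically. The sketch below encodes a simplified reading of the table's rows as (indicator, latest year of available data, enables non-EU comparison) and lists the indicators whose data appear outdated or non-comparable; the 2017 cut-off for "reasonably up to date" is our assumption.

```python
# Simplified reading of Table 2: (indicator, latest year of data, non-EU comparison possible).
table2 = [
    ("LEIT - patent applications and patents awarded", 2017, True),
    ("LEIT - firms introducing innovations new to the company or market", 2017, False),
    ("LEIT - joint public-private publications", 2018, True),
    ("Risk Finance - venture capital investments", 2017, True),
    ("Risk Finance - debt financing", 2017, True),
    ("SME - introducing innovations new to the company or market", 2016, True),
    ("SME - growth and job creation (turnover, employees)", 2015, False),
]

CUTOFF = 2017  # our assumption of "reasonably up to date" at the time of writing

stale = [name for name, year, _ in table2 if year < CUTOFF]
no_comparison = [name for name, _, comparable in table2 if not comparable]

print("Outdated data:", stale)
print("No non-EU comparison:", no_comparison)
```

Run against the table, this flags the two SME indicators as outdated and the firm-level innovation and SME growth indicators as lacking non-EU comparison, which matches the availability problems discussed in the conclusion below.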

Conclusion

If we take a look at the list of features a relevant indicator should meet (see Cornescu and Adam 2013), we can conclude that the H2020 key indicators are specific, measurable and practical (they can be used). When collected, they are transparent in methodology and selection and well-grounded. We do, however, have problems with availability. According to Table 2, we succeeded in obtaining all the relevant information; how we managed to retrieve it is another story. The EC document Interim evaluation of Horizon 2020 has solid, well-structured data answering all the key indicators, but it does not enable comparison with non-EU countries, and some of its data was outdated. Due to this lack of information, we had to do extra searches in the OECD and Eurostat databases, the H2020 Dashboard, etc. This took a lot of time and extra knowledge. When preparing the Interim evaluation of Horizon 2020, its authors also had to use several different sources: H2020 monitoring reports and statistical data from internal IT tools, Eurostat and the OECD. Extensive analysis was carried out by the responsible Commission service on the different programme parts of H2020, external evaluations, participant networks, internal assessment, etc. (see EC DGRI 2017). This kind of methodology does not demonstrate transparent access to information for the general public. Nevertheless, this could be remedied if the report were published each year.

Since we do not believe the methodology will change by the end of the instrument's life, we should find another way to enable and support the discourse. One can find it in the European Innovation Scoreboard. It does not answer all the H2020 key indicators, but it has many other attributes: it measures innovation performance and trends, benchmarks innovation performance against non-EU countries, shows expected short-term changes in EU innovation performance and sets out country profiles. In addition, it is published each year.

As important as the discourse and its continuing variation is, this evaluation also shows that the institutions of an economic imaginary need to be careful with its retention. Even though we want to avoid complexity, one needs to keep in mind that there are discourses that can overlap because of their nature, because of need, or because they can offer each other more clarity. Consistency and transparency are needed not only throughout the different policies and strategy goals but also throughout their retention, in order to assure the set goal.
