Volume 4 (2019): Issue 1 (February 2019)
Journal Details
Format: Journal
eISSN: 2543-683X
First Published: 30 Mar 2017
Publication timeframe: 4 times per year
Languages: English
License: Open Access

Measuring Scientific Productivity in China Using Malmquist Productivity Index

Published Online: 21 Feb 2019
Volume & Issue: Volume 4 (2019) - Issue 1 (February 2019)
Page range: 32 - 59
Received: 29 Jun 2018
Accepted: 04 Dec 2018
Abstract

This paper aims to investigate the scientific productivity of China’s science system.

This paper employs the Malmquist productivity index (MPI) based on Data Envelopment Analysis (DEA).

The results reveal that the overall efficiency of Chinese universities increased significantly from 2009 to 2016, driven mainly by technological progress. From the perspective of the functions of higher education, research and transfer activities performed better than teaching activities.

As an implication, the indicator selection mechanism, the investigation period, and the MPI model can be further extended in future research.

The results indicate that Chinese education administrative departments should take action to guide and promote teaching activities and formulate reasonable resource allocation regulations to achieve balanced development in Chinese universities.

This paper selects 58 Chinese universities and conducts a quantitative measurement over the period 2009–2016. The three main functional activities of universities (i.e. teaching, research, and application) are innovatively categorized into different schemes, and we calculate their performance respectively.


Introduction

In the last forty years, China has made great efforts to improve the performance of its science system and to build an excellent higher education system. In August 2015, the National Science Library of the Chinese Academy of Sciences (LAS, CAS) published an official report named “Blue Book of Basic Research Competitiveness for China in 2015” (Research group of competitiveness analysis of Chinese basic research of National Science Library, 2015), announcing that the number of papers published in Science Citation Index (SCI)/Social Sciences Citation Index (SSCI) journals by China reached 216,000 in 2013, equal to about 62.2% of the United States' output in the same year. In several disciplines, e.g. Environmental Biotechnology and Chemical Engineering, China even outperformed the United States in terms of SCI publications. Furthermore, the number of citations in China also increased significantly, rising from about 6.6% of the United States' total in 2008 to 12.4% in 2012.

Two reasons may contribute to these tremendous achievements in Chinese publications and citations. On the one hand, increasing resources have been invested in the Chinese science system since the beginning of the 21st century. OECD (2018) shows that China's Gross Domestic Expenditure on Research and Development (GERD) in 2016 was ten times as much as that in 2000. Concretely, the GERD in China was USD 412 billion in 2016, compared to USD 464 billion in the United States, USD 104 billion in Germany, and USD 150 billion in Japan. Figure 1 shows the GERD of several countries from 2001 to 2016. On the other hand, the state has reorganized the

Figure 1

GERD of several countries from 2001 to 2016 (millions in 2017 constant USD). Note: The data for China do not include Chinese Taiwan.

science system toward a more competitive one, thereby making more efficient use of existing resources. For example, the Chinese government has gradually increased the autonomy of universities with respect to degree awarding and subject setting. Besides, the education authorities have created a more competitive atmosphere in the resource allocation of the science system by dramatically increasing the proportion of competitive research funding, whose approval depends heavily on performance evaluation results. This kind of competition exists not only in the distribution of resources from the state to the universities but also within universities.

The existence of fierce competition helped to exploit the performance potential of the Chinese science system, but it is noticeable that over the past decades the implemented incentive programs were heavily biased towards research activities. As the main body of the science system, universities should balance their three main functions: teaching, research, and serving society (Schmoch & Schubert, 2009; Schmoch et al., 2010). Therefore, it is necessary to evaluate the evolution of Chinese scientific productivity and determine whether the considerable progress was attributable to the establishment of a fiercely competitive research system at the expense of the efficiency of teaching activities.

Higher education efficiency, as a public economic issue, has been highly valued by governments and society and investigated by many researchers all over the world. However, the previous literature mainly focuses on the determination of systemic efficiency, the subsequent ranking or classification of decision-making units (DMUs), and the measurement of the impact of external factors on efficiency scores. Less research concerns the different performance of the three functions inside the higher education system and the special educational resource distribution system in China. See the literature review of higher education efficiency in Subsection 3.1 for details.

To address these issues, this paper first provides an overview of China's primary science policies since the end of the Cultural Revolution and the current management patterns in universities. Quantitative assessments of the productivity evolution of a sample of 58 universities and their three main functions during 2009–2016 are then conducted using a Malmquist productivity approach based on Data Envelopment Analysis (DEA). Some policy implications and suggestions for the Chinese education authorities are also put forward based on the analysis of the results.

The rest of the paper is organized as follows. Section 2 summarizes the science policies since the end of the Chinese Cultural Revolution and the prevailing managerial patterns inside Chinese universities, including the resource allocation and reward systems. Section 3 presents the previous research on higher education efficiency and introduces the MPI with its decompositions. Section 4 describes the empirical results of Chinese universities. Section 5 discusses the policy implications. Section 6 concludes this paper with some discussions.

Overview of the Chinese science system
The evolution of Chinese science policies

The Cultural Revolution in China was a period of turmoil that had a tremendous negative impact on all aspects of society. Since its end, China's political orientation has undergone tremendous changes, and science and education policies have improved in all respects.

In 1977, the Chinese college entrance examination system was restored, having been suspended for more than ten years because of the Cultural Revolution. China thereby resumed its respect for knowledge and talent and ushered in new development opportunities in the field of science and technology. In the same year, the Chinese leader Deng Xiaoping declared at the National Work Symposium on Education and Science that China should consider science, technology, and education the primary tools for catching up with the developed countries. In 1983, Deng reemphasized this doctrine, announcing that education should be oriented toward modernization, the world, and the future.

Following this general guideline, the first step towards reorganizing the science system was taken in 1985. The central government issued “The Decision of the Communist Party of China Central Committee on Education Reform” and decided to increase the autonomy of universities. A certain degree of autonomy over student admissions, which had previously been fully determined by national plans, was granted to universities. Furthermore, in the same year, the central government also issued the “Notice on the Research and Pilot of Assessment of Higher Engineering Education”, which started the pilot of education assessment in China. This policy shift was furthered by the “Interim Provisions on the Evaluation of Regular Institutions of Higher Education” issued in 1990, which tried to guide universities to train students, conduct scientific research, and promote societal development. It was the first legislation on higher education assessment in China, clearly regulating the basic elements of higher education assessment, including its purpose, mission, guiding ideology, and basic forms. Subsequently, the central government began organizing higher education assessments in universities, conducted mainly on teaching activities.

Since 1992, governments at all levels have made considerable efforts to strengthen the construction of China's science system. On the one hand, they declared the development of education and science to be the key to the overall development of China and guided education and scientific activities accordingly. On the other hand, they improved the science and education system by increasing the autonomy of the science system, thereby improving the productivity and creativity of scientific activities. Although there were still many restrictions on the autonomy of China's science system compared with developed countries at that time, this was nevertheless a breakthrough in China's traditional science and education system, which had been under strict top-down bureaucratic control.

In the subsequent years, the overall scale of the science system increased dramatically, which was especially reflected in the rapidly growing number of college students. Since 1998, more and more Chinese universities have obtained the right to expand their enrollment, so that an increasing number of high school graduates gained access to colleges and universities. In January 1999, the Ministry of Education (MOE) announced the “21st Century Education Revitalization Action Plan”, which defined the goal of expansion as follows: “China's higher education enrollment rate should be increased from the current level of 9% to about 15% in 2010”. Table 1 shows the rapid expansion of Chinese higher education from 1998 to 2007. The enrollment number, which denotes the number of newly enrolled students in Chinese colleges and universities, increased more than fourfold from 1998 to 2007. The university number, which refers to the number of Chinese colleges and universities, nearly doubled compared with 1998. The enrollment rate, which refers to the ratio of higher education students to the school-age population, increased from 9.8% to 23.0%. The rapid growth of these figures shows China's emphasis on higher education and the positive impact of implementing various science policies.

The expansion of Chinese higher education from 1998 to 2007.

Year                               1998    1999    2000    2001    2002    2003    2004    2005    2006    2007
Enrollment number (10 thousand)    108.4   159.7   220.6   268.3   320.5   382.2   447.3   504.5   546.1   565.9
University number                  1,022   1,071   1,041   1,225   1,396   1,552   1,731   1,792   1,867   1,908
Enrollment rate (%)                9.8     10.5    12.2    12.9    15.0    17.0    19.0    21.0    22.0    23.0

Data source: China Education Statistical Yearbooks.
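The growth multiples quoted above follow directly from the Table 1 figures; a minimal arithmetic check (values transcribed from Table 1, variable names ours):

```python
# Values transcribed from Table 1 (China Education Statistical Yearbooks).
enrollment_1998, enrollment_2007 = 108.4, 565.9      # units: 10 thousand students
universities_1998, universities_2007 = 1_022, 1_908  # number of institutions

# Enrollment grew more than fourfold (ratio of about 5.2).
enrollment_ratio = enrollment_2007 / enrollment_1998
# The number of universities nearly doubled (ratio of about 1.9).
university_ratio = universities_2007 / universities_1998

print(round(enrollment_ratio, 2), round(university_ratio, 2))  # 5.22 1.87
```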

This overall increase was accompanied not only by a large volume of resources devoted to this field but also by an increasing need for efficiency and productivity in the resource utilization of the science system. To this end, the country's administrative controls were further reduced to remove redundant restrictions on universities. In 2013, the central government decided to gradually reduce the administrative control of universities and promote their autonomy. In January 2014, the MOE issued the “Regulations of Academic Committee of Higher Education Institutions”, which prescribed that Chinese universities should establish academic committees according to the law as the core of the academic management system and organizational structure. At the same time, the role of competition was emphasized at both the individual and organizational levels to align the incentives in the science system with societal needs. Thus, as in most other science systems in developed countries, resources are increasingly awarded based on performance.

In recent years, a widespread phenomenon has emerged in China's science and education system: the incentive mechanism is extremely biased towards scientific research rather than teaching. Most of the improvement in China's scientific productivity comes from scientific research at the expense of teaching activities, as we will describe below. Therefore, although China's policymakers assign equal importance to the development of scientific research and the teaching of students, more and more education practitioners regard producing scientific achievements rather than improving teaching skills as their career goal. To gain a deeper understanding of the problems in China's science system, we review below the resource allocation and incentive mechanisms that have developed from the state to universities as well as from universities to their employees (researchers).

Resource allocation and reward systems in Chinese universities

Since 1993, the state has formulated a series of educational programs aimed at improving the structure of China's science system while improving the overall competitiveness of Chinese universities. Under these programs, the Chinese government allocated a large volume of resources to a selected number of universities and guided them on their way towards becoming internationally competitive.

Among the numerous projects, Project 211, Project 985, and Plan 2011 are the three most influential programs. The details of these national education programs are as follows: (1) Project 211. This program, launched in 1993, aims to select and support about 100 universities in China and build frontier universities for the 21st century, with an investment scale of RMB 18 billion Yuan (at current-year prices); (2) Project 985. Launched in 1998, it aims to select and support dozens of universities in China to build excellent and internationally competitive universities, with a total investment of more than RMB 30 billion Yuan (at current-year prices); (3) Plan 2011. The Higher Education Innovation Capacity Improvement Plan, launched in 2011 and referred to as Plan 2011, aims to improve universities' innovation capability to face the scientific frontier and to continue building several world-leading universities and disciplines.

The choice of universities admitted to these programs was based on past achievements. Although the general principles of the selection emphasized that research and education are equally important, the most important selection criterion in practice was the number of high-quality international publications. Therefore, many resources for the development of the Chinese science system were concentrated in a handful of universities with strong scientific achievements. As of July 2014, there were 2,246 higher education institutions of different kinds in China, among which 782 were public. The institutions involved in Project 211 and Project 985, which represent the highest research level in China, number only 112, accounting for less than 15% of public institutions and less than 5% of all higher education institutions. According to the data released by the MOE, from 2009 to 2013 the funding allocated by the MOE to all higher education institutions in China reached RMB 264.769 billion Yuan, but Project 211 and Project 985 universities received about 72% of the total. The remaining more than 2,000 higher education institutions therefore received only about RMB 70 billion Yuan, a share of about 28% of the total.

These figures refer to http://wap.sciencenet.cn/info.aspx?id=307525, and the statistical data are from the China Education Statistical Yearbooks (2010–2014) issued by the MOE.

The Higher Education Evaluation Center (HEEC), established in August 2004, is an administrative organization under the auspices of the MOE. The main responsibilities of the HEEC include organizing the evaluation of baccalaureate and associate degree programs offered at different universities and colleges in China, conducting research on regulations and policies for higher education reforms, and providing recommendations for decision-makers. The HEEC also shoulders the responsibility of developing international cooperation and exchanges with higher education evaluation/accreditation agencies from other countries as well as from Hong Kong, Macao, and Chinese Taiwan (Luo Wang, 2007). On behalf of the MOE, the HEEC evaluates its subordinate universities in China every five years. The establishment of the HEEC indicates that a formal higher education assessment system has been built. In its evaluation framework, international publications (i.e. SCI/SSCI) are also a very important indicator. The evaluation results affect a university's resource allocation and the careers of its directors. Thus, the directors have a strong motivation to push researchers to publish an increasing number of international papers.

The existence of the university assessment system directly affects the design of the university’s internal management system, especially the evaluation and reward system for researchers. To inspire good scientific achievements and obtain more resources from the government, the universities mainly incentivize researchers through the following mechanisms.

First, the salary of researchers is not a fixed amount but depends largely on performance. Normally, it consists of a basic salary and a performance salary. The basic salary is the same for researchers in the same occupational category. The performance salary, however, varies considerably with different performance indicators, e.g. teaching, research funding, and publications. The workload of teaching activities is based on teaching time and quality, which is difficult to measure. Publications, especially international publications (e.g. SCI/SSCI papers), are one of the main factors by which researchers in universities compete for research funding. Furthermore, the amount of research funding is highly correlated with a researcher's salary, reputation, and academic impact in the field. Having enough research funding is also the prerequisite for recruiting master and Ph.D. students, post-docs, and other temporary staff.

Second, publication lists are a key factor in researchers' promotion. Being a full professor is obviously a prestigious status in the research community, and a relatively large salary gap exists between researchers with and without a professorship. Besides, researchers without a professorship find it hard to conduct research independently because most research projects can only be applied for by, and granted to, full professors. Therefore, the demand for promotion also stimulates researchers' desire to publish papers and engage in academic research.

Third, in most Chinese universities, the primary authors (normally the first author or the corresponding author) of international publications are awarded an extra bonus at the end of the year according to impact factors or other criteria. For example, universities normally award RMB 100 thousand Yuan to a researcher who publishes a paper in a top journal (e.g. Science, Nature, or Cell) as the primary author. This practice is very common, and almost every Chinese university does so, although the amount of the bonus depends on the university. It is recorded and termed a cash-per-publication reward policy in Quan et al. (2017). To the best of our knowledge, the highest bonus reaches RMB 500 thousand Yuan for one top paper in Science, Nature, or Cell at Wuhan University, one of the most famous universities in China.

http://news.whu.edu.cn/info/1002/41417.htm

Fourth, keeping one's position is also tied to scientific research outcomes. Researchers without enough publications, in either number or quality, can be moved to a less important logistics or service position (e.g. secretary), with a relatively low salary. Recently, more and more universities, especially high-level research universities, have adopted the tenure-track system, under which researchers without a professorship must leave if they fail to publish enough high-quality papers within a certain period, e.g. six or eight years.

Fifth, for students (especially Ph.D. candidates) in the better Chinese universities, one of the most important qualifications for graduation is publishing a certain number of SCI/SSCI papers (Liu & Zhi, 2012). Ph.D. candidates who fail to publish enough SCI/SSCI papers will not be awarded the Ph.D. degree. Post-docs cannot stay at the university and transition to a permanent or tenure-track position as a lecturer if they cannot produce satisfactory publications.

Finally, research and development (R&D) application can also improve researchers' income. This kind of technology transfer behavior is common and popular across academia (Yong, 1996), and it has greatly promoted university–industry collaboration. If a researcher's results can be transferred to a company and generate business revenue, the researcher receives a bonus in return. This path may attract researchers who do not intend to become full professors, because good publications often require a large investment of time and energy.

For these reasons, researchers, especially young researchers without a professorship, endure relatively high pressure concerning promotion, income, and even their positions, which depend mainly on publications, especially high-impact SCI/SSCI publications. Therefore, Chinese researchers in high-level universities have a very strong motivation to lengthen their publication lists. This pressure makes them spend most of their time writing papers instead of ensuring high-quality teaching, because it is hard to be promoted to full professor through teaching.

Methodology
Previous studies on higher education efficiency

The issue of higher education efficiency in different countries has been investigated by many researchers. According to the time span covered by the research methods, this literature can be divided into studies of efficiency at a certain point in time and studies of productivity over a time period.

In the first category, Agasisti (2011) conducted a cross-country comparison of the efficiency of tertiary education and analyzed the role of the public sector. Agasisti and Bianco (2009) computed efficiency rankings and scores of Italian universities using DEA and Stochastic Frontier Analysis (SFA). Barra et al. (2018) measured the efficiency of Italian higher education using both DEA and SFA techniques and used the results to provide guidance to university managers and policymakers. De França et al. (2010) evaluated the impact of information asymmetry on Brazilian organizational efficiency based on an additive DEA approach. Johnes (2013) evaluated the efficiency of higher education institutions in England using traditional black-box DEA and network DEA; the results suggested that employability was the main issue constraining the growth of institutional efficiency. Singh and Ranjan (2017) proposed a single-stage DEA approach that can measure the efficiency of DMUs and their parallel sub-units simultaneously; Indian universities, colleges, and stand-alone institutions were treated as non-homogeneous sub-units in the assessment of their efficiency. Wolszczak-Derlacz (2017) assessed the technical efficiency of European and American higher education institutions (HEIs) using DEA and determined different frontiers (global, regional, and country-specific); the impact of external factors on HEI inefficiency was also examined.

Among the productivity studies, Agasisti and Johnes (2009) investigated the technical efficiency and the Malmquist index of Italian and English higher education institutions and found that English institutions were more efficient than Italian ones. Agasisti and Pohl (2012) examined and compared the efficiency and the Malmquist index of Italian and German public universities from 2001 to 2007, aiming to provide useful managerial and policy-making suggestions for the policymakers of both countries. Barra and Zotti (2016) investigated the technical efficiency of two main activities, teaching and research, in two large groups, the Science and Technology sector and the Humanities and Social Sciences sector, at a big public university in Salerno from 2005 to 2009. Barros et al. (2011) employed DEA to investigate the nature of technical change in the French labor market and the productivity change in a sample of higher education leavers from 1999 to 2004. Edvardsen et al. (2017) investigated the productivity development of Norwegian institutions of higher education from 2004 to 2013 using a bootstrapped Malmquist index based on DEA. Thanassoulis et al. (2011) applied DEA and SFA to assess the efficiency of English higher education institutions and revealed that the increase in students and the student mix are factors in the improvement of efficiency. They also adopted the Malmquist index to assess productivity change in UK higher education and found a downward movement of institutional productivity on average.

MPI

In this paper, we use the MPI to investigate the efficiency changes of the sample universities. This index was first introduced by Malmquist (1953) and has been further studied and developed in a non-parametric framework by several authors (e.g. Färe et al., 1994). Odeck (2000) used the MPI in an efficiency and productivity analysis of the Norwegian motor vehicle inspection agencies during 1989–1991. Asmild et al. (2004) measured performance over time in the Canadian banking industry using an approach that combines the MPI with the DEA window analysis technique, which can handle panel data and has the advantage of dealing with small samples. The labor productivity of the European Union area was investigated in Färe et al. (2006). Cooper et al. (2007) showed that the MPI represents the Total Factor Productivity (TFP) growth of a DMU and can reflect (1) the progress or regress in technical efficiency and (2) the change of the frontier technology between two time periods under a multiple-input, multiple-output framework.

Suppose there are m input indicators and s output indicators, and the observation of DMUj (j = 1,2,...,n) in period t (t = 1,2,...,T) can be denoted as $\left( x_{j}^{t},y_{j}^{t} \right)\in R_{m}^{+}\times R_{s}^{+}$. The production possibility set (PPS) can be defined as

$${{T}^{t}}=\left\{ \left( {{x}^{t}},{{y}^{t}} \right)\left| \sum\nolimits_{j=1}^{n}\lambda _{j}^{t}x_{j}^{t}\le {{x}^{t}},\ \sum\nolimits_{j=1}^{n}\lambda _{j}^{t}y_{j}^{t}\ge {{y}^{t}},\ \sum\nolimits_{j=1}^{n}\lambda _{j}^{t}=1,\ \lambda _{j}^{t}\ge 0,\ \forall j \right. \right\}$$

where $\lambda _{j}^{t}$ denotes the intensity variable. In this paper, we assume the production technology satisfies the variable returns to scale assumption, so $\sum\nolimits_{j=1}^{n}\lambda _{j}^{t}=1$ is incorporated in Eq. (1) to specify the shape of the PPS. The PPS is characterized by its frontier, formed by the points $(x^{t},y^{t})\in T^{t}$ for which, under the current technology, a combination of less input and more output cannot be obtained.
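As a concrete sketch of how an efficiency score against this PPS can be computed, the input-oriented VRS efficiency of a DMU is the optimum of a small linear program; the following Python implementation with SciPy is our own illustration (the function name and toy data are assumptions, not from the paper):

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(x0, y0, X, Y):
    """Input-oriented VRS DEA score of (x0, y0) against the frontier
    spanned by the n DMUs in X (n x m inputs) and Y (n x s outputs):
    min theta  s.t.  sum_j lam_j x_j <= theta * x0,
                     sum_j lam_j y_j >= y0,
                     sum_j lam_j = 1,  lam_j >= 0."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                    # variables: [theta, lam_1..lam_n]
    A_in = np.hstack([-x0.reshape(-1, 1), X.T])    # inputs:  sum lam x - theta x0 <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])    # outputs: -sum lam y <= -y0
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -y0]
    A_eq = np.r_[0.0, np.ones(n)].reshape(1, -1)   # VRS: sum lam = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n + 1))
    return res.fun if res.success else None        # None if the LP is infeasible

# Toy data: 3 DMUs with one input and one output (illustrative only).
X = np.array([[1.0], [2.0], [4.0]])
Y = np.array([[1.0], [2.0], [3.0]])
print(dea_efficiency(np.array([2.0]), np.array([1.0]), X, Y))  # 0.5: input could halve
```

Returning None when the program is infeasible anticipates the cross-period case discussed later, where a data point may lie outside the other period's frontier.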

In the framework of Malmquist analysis, the efficiency of DMU0 is evaluated by frontiers in different periods through various ways. The MPI is defined as the product of the Catch-up term and the Frontier-shift term (Caves et al., 1982; Färe et al., 1989; Tone, 2004). The Catch-up term captures the efficiency change of a DMU. The Frontier-shift term reflects the technology change between two periods. Therein, the Catch-up term is measured as

$$\text{Catch-up}=\frac{\text{The efficiency of }\left( x_{0}^{2},y_{0}^{2} \right)\text{ with respect to the frontier in period 2}}{\text{The efficiency of }\left( x_{0}^{1},y_{0}^{1} \right)\text{ with respect to the frontier in period 1}}$$

The efficiency of every DMU can be measured by suitable DEA models. We use a simple case with a single input and a single output to illustrate the Catch-up effect, as shown in Figure 2.

Figure 2

An illustrative case of the Malmquist index.

Hence, the input-oriented Catch-up effect can be measured as

$$\text{Catch-up}=\frac{OB/OE}{OC/OF}$$

A Catch-up value larger than (equal to, smaller than) unity represents an increase (no change, decrease) in the relative efficiency of DMU0.

The Frontier-shift term can be embodied in the quotient of the efficiencies of $\left( x_{0}^{1},y_{0}^{1} \right)$ under the different frontiers as

$$\text{Frontier-shift}^{1}=\frac{\text{The efficiency of }\left( x_{0}^{1},y_{0}^{1} \right)\text{ with respect to the frontier in period 1}}{\text{The efficiency of }\left( x_{0}^{1},y_{0}^{1} \right)\text{ with respect to the frontier in period 2}}$$

where the denominator adopts the efficiency of $\left( x_{0}^{1},y_{0}^{1} \right)$ relative to the frontier in period 2. Similarly, the Frontier-shift effect of $\left( x_{0}^{2},y_{0}^{2} \right)$ is estimated by

$$\text{Frontier-shift}^{2}=\frac{\text{The efficiency of }\left( x_{0}^{2},y_{0}^{2} \right)\text{ with respect to the frontier in period 1}}{\text{The efficiency of }\left( x_{0}^{2},y_{0}^{2} \right)\text{ with respect to the frontier in period 2}}$$

Therefore, the Frontier-shift effect is defined as the geometric mean of Eq. (4) and Eq. (5) as

$$\text{Frontier-shift}=\sqrt{\text{Frontier-shift}^{1}\times \text{Frontier-shift}^{2}}$$

Using the illustrative case in Figure 2, the input-oriented Frontier-shift effect can be measured as

$$\text{Frontier-shift}=\sqrt{\frac{\left( OC/OF \right)\left( OD/OE \right)}{\left( OA/OF \right)\left( OB/OE \right)}}$$

A Frontier-shift value larger than (equal to, smaller than) unity denotes progress (no change, regress) in the frontier technology. Consequently, the MPI can be written as the product of the Catch-up term and the Frontier-shift term:

$$\text{MPI}=\text{Catch-up}\times \text{Frontier-shift}$$

Following the technical efficiency derived from the input-based DEA model and the traditional radial distance function in Shephard (1970), the efficiency score of DMU0 in period t relative to the frontier in period t + 1 can be written using the corresponding directional distance function as

$${{D}^{t+1}}\left( x_{0}^{t},y_{0}^{t} \right)=\min \left\{ \theta \,\left|\, (\theta x_{0}^{t},y_{0}^{t})\in {{T}^{t+1}} \right. \right\}$$

which measures the maximum extent to which DMU0 can contract its inputs while maintaining its outputs. Using this notation, the Catch-up term in Eq. (2) is rewritten as

$$\text{Catch-up}\left( t,t+1 \right)=\frac{{{D}^{t+1}}(x_{0}^{t+1},y_{0}^{t+1})}{{{D}^{t}}(x_{0}^{t},y_{0}^{t})}$$

The Frontier-shift term in Eq. (7) is expressed as

$$\text{Frontier-shift}\left( t,t+1 \right)=\sqrt{\frac{{{D}^{t}}(x_{0}^{t},y_{0}^{t})\cdot {{D}^{t}}(x_{0}^{t+1},y_{0}^{t+1})}{{{D}^{t+1}}(x_{0}^{t},y_{0}^{t})\cdot {{D}^{t+1}}(x_{0}^{t+1},y_{0}^{t+1})}}$$

If we multiply these two terms together, the MPI is obtained as

$$M\left( t,t+1 \right)=\sqrt{\frac{{{D}^{t}}(x_{0}^{t+1},y_{0}^{t+1})\cdot {{D}^{t+1}}(x_{0}^{t+1},y_{0}^{t+1})}{{{D}^{t}}(x_{0}^{t},y_{0}^{t})\cdot {{D}^{t+1}}(x_{0}^{t},y_{0}^{t})}}$$

An MPI exceeding one indicates progress in the TFP of DMU0 from period t to period t + 1, while an MPI equal to one or less than one indicates stability or deterioration of the TFP, respectively.
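To make the computation concrete, each distance function above can be solved as an input-oriented DEA linear program, and the four cross-period scores combine directly into the Catch-up, Frontier-shift, and MPI terms. The following is a minimal sketch under a constant-returns-to-scale assumption; the `distance` helper and the toy data are ours for illustration, not taken from this paper.

```python
# A minimal numerical sketch of the MPI decomposition above, under a
# constant-returns-to-scale (CRS) assumption. Each distance function is an
# input-oriented DEA linear program; the toy data are hypothetical.
import numpy as np
from scipy.optimize import linprog

def distance(x0, y0, X_ref, Y_ref):
    """min theta  s.t.  X_ref @ lam <= theta * x0,  Y_ref @ lam >= y0,  lam >= 0."""
    m, n = X_ref.shape
    s = Y_ref.shape[0]
    c = np.r_[1.0, np.zeros(n)]                     # variables: [theta, lam_1..lam_n]
    A_in = np.hstack([-x0.reshape(-1, 1), X_ref])   # inputs:  lam.X - theta*x0 <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y_ref])   # outputs: -lam.Y <= -y0
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -np.asarray(y0)],
                  bounds=[(0, None)] * (1 + n))
    return res.x[0]

# one input, one output, three DMUs observed in periods t and t+1 (toy numbers)
Xt, Yt = np.array([[2.0, 4.0, 6.0]]), np.array([[2.0, 5.0, 6.0]])
Xt1, Yt1 = np.array([[2.0, 3.0, 6.0]]), np.array([[2.5, 5.0, 7.0]])

# the four cross-period scores of DMU 0
d_t_t = distance(Xt[:, 0], Yt[:, 0], Xt, Yt)        # D^t(x^t, y^t)
d_t_t1 = distance(Xt1[:, 0], Yt1[:, 0], Xt, Yt)     # D^t(x^{t+1}, y^{t+1})
d_t1_t = distance(Xt[:, 0], Yt[:, 0], Xt1, Yt1)     # D^{t+1}(x^t, y^t)
d_t1_t1 = distance(Xt1[:, 0], Yt1[:, 0], Xt1, Yt1)  # D^{t+1}(x^{t+1}, y^{t+1})

catch_up = d_t1_t1 / d_t_t
frontier_shift = np.sqrt((d_t_t * d_t_t1) / (d_t1_t * d_t1_t1))
mpi = np.sqrt((d_t_t1 * d_t1_t1) / (d_t_t * d_t1_t))
assert abs(mpi - catch_up * frontier_shift) < 1e-9  # the product identity holds
```

On this toy data the DMU moves closer to a frontier that itself improves, so both terms differ from unity while their product always reproduces the MPI, as the algebra guarantees.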

Interested readers can refer to Cooper et al. (2007) for details. Pastor and Lovell (2005) argued that the MPI fails to satisfy circularity and may encounter an infeasibility problem. Concretely, the geometric-mean MPI is not circular because the choice of time interval alters the frontiers and thus yields different measures of productivity change. The infeasibility problem occurs when data points lie outside the frontier and cannot be projected onto it. However, circularity is not our focus in this paper, and we observe no infeasibility problem in the case study.

Scientific Productivity in Chinese Universities
Indicators and data sources

The higher education system absorbs a large amount of governmental financial investment, and it plays an important role in creating, producing, and popularizing knowledge in society. With rapid economic development and social progress, the efficiency of higher education has long been a topic of concern, and an increasing number of scholars have conducted research on the performance of higher education systems. We summarize some of the existing literature on higher education efficiency; the indicators used are listed in Table 2.

Input and output indicators in the existing literature.

Authors (Year) | Input indicators | Output indicators
Agasisti (2011) | Expenditure, Entry rates, Students: teachers | Population, Graduation rates, Employment, Foreign students
Agasisti and Bianco (2009) | Costs for non-academic staff, Costs for academic staff, Costs for all staff, Other costs, Total costs | Students enrolled in scientific courses, Students enrolled in non-scientific courses, Total number of students, Ph.D. students, External funds for research activities
Agasisti and Johnes (2009) | Total number of students, Total amount of financial resources/incomes, Number of Ph.D. students, Number of academic staff | Number of graduates, Total amount of external grants and contracts for research
Agasisti and Pohl (2012) | Students, Academic staff, Expenditures | Graduates, External research
Barra et al. (2018) | Number of academic staff, Percentage of enrolments with a score higher than 9/10 in secondary school, Percentage of enrolments who attended a lyceum, Total number of students | Number of graduates weighted by their degree classification, Research grants
Barra and Zotti (2016) | Equivalent personnel, Total amount of financial resources the department spends on research activities, Total amount of financial resources the faculty allocates for teaching activities, Total number of students enrolled | Number of publications, Total external research funding obtained by the university, Research productivity index / capacity of attracting resources index / research productivity per cost of the academic staff index, Number of graduates weighted by their degree classification, Student satisfaction index / undergraduate satisfaction index
Barros et al. (2011) | Number of educational years in higher education, Experience of the workers without student job | Wages
Edvardsen et al. (2017) | Faculty employees, Administration and other employees | Study points for courses of a lower degree, Study points for courses of a higher degree, Publishing points, Doctorates/Ph.D.s
de França et al. (2010) | Total number of professors with Ph.D. degrees, Total number of support and administrative personnel, Total number of majors offered | Total number of students that graduated during the year, Total number of candidates registered to take the entrance examination, Total number of students enrolled at the university
Johnes (2013) | First node: Intake quality, Student: staff ratio, Per-student spend; Second node: Degree results, Research reputation | First node: Student satisfaction, Degree results; Second node: Employability
Singh and Ranjan (2017) | Number of teachers (Professors and Associate Professors), Number of teachers (Assistant Professors and others), Number of non-teaching staff | Students enrolled in Ph.D. and M.Phil., Students enrolled in P.G. and P.G. Diploma, Students enrolled in U.G., Students enrolled in Diploma, Certificate, and Integrated courses, Students enrolled in P.G. and P.G. Diploma open courses, Students enrolled in U.G. open courses
Thanassoulis et al. (2011) | Total operating cost | Undergraduates in medicine or dentistry, Undergraduate science students, Undergraduate non-science students, Postgraduate students in all disciplines, Quality-related funding and research grants, Income from other services
Wolszczak-Derlacz (2017) | Academic staff, Total revenue, Total number of students, Non-academic staff | Publications, Graduates, Scientific articles

As stated in Schmoch and Schubert (2009) and Schmoch et al. (2010), teaching, researching, and serving society (mainly reflected in the application of R&D outputs and technology services) are the three main functional activities undertaken by universities. These independent functional activities are separated into three schemes. Following the indicators used in the previous literature, we select three input indicators corresponding to the three functions, and each one is used in the input-based DEA model to measure the MPI of its scheme separately. The indicators are the number of teaching staff (TEACHING STAFF), the number of staff who spend most of their time on R&D activities (R&D STAFF), and the number of staff who spend most of their time on the application of R&D outputs and technology services (APPLICATION STAFF). TEACHING STAFF stands for personnel with teacher qualifications who specialize in teaching activities. R&D STAFF refers to personnel belonging to the affiliated research organizations of universities. APPLICATION STAFF comprises personnel engaged in the application and service of scientific research achievements.

Furthermore, we use the total number of staff (TOTAL STAFF) at each university as the single input of Scheme 1, which aims to reflect the overall input-output condition within the university. All four schemes employ a single input indicator because staff numbers and allocated funding overlap heavily in the universities. To identify the potential driving forces of the overall productivity evolution (Scheme 1), we design three sub-schemes to analyze the productivity evolution of teaching (Scheme 2), researching (Scheme 3), and application (Scheme 4) activities in Chinese higher education. Figure 3 shows how the input and output indicators are organized in each of the four schemes.

Figure 3

Input and output indicators in different schemes.

The output indicators are the number of students at the university each year (STUDENTS), SCI/SSCI publications (SCI PUB.), and technology transfer income (TT INCOME). In detail, STUDENTS denotes the number of undergraduate, master, and Ph.D. students taught or trained in the universities each year. Research, as an important mission of universities, has often dominated the other core mission of higher education, teaching, in recent years. However, it is necessary to recognize that teaching or training students, which provides high-quality human resources for the future, is of considerable importance for the long-run sustainability of universities. A variety of measures of teaching or training students have been developed and used in the existing literature (e.g. Avkiran, 2001; Glass et al., 2006; Schubert, 2014; Worthington & Lee, 2008). Merton (1957, 1968) pointed out that publication is an important step in establishing the priority of scientific discovery, which is one of the main objectives of scientists. Therefore, most research institutions (including universities and research institutes) prefer to assess the performance of researchers using publications and citations (Zhang et al., 2011). Thus, publications are one of the important output indicators for measuring the efficiency or scale characteristics of research institutions in practice (e.g. Rousseau & Rousseau, 1997; Schubert, 2009; Schmoch & Schubert, 2009; Schubert, 2014; Yang et al., 2014). Technology transfer is a process through which technical information and products developed by the Federal Government are provided to potential users, in a manner that encourages and accelerates their evaluation and/or use

http://www.usgs.gov/tech-transfer/

. In this paper, we use TT INCOME to represent the revenue from R&D outputs and technology services of these universities. Specifically, STUDENTS, SCI PUB., and TT INCOME are used as the three output indicators in Scheme 1, and each of them is separately used as the single output indicator of one of the remaining three schemes.
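For clarity, the four-scheme layout just described (and depicted in Figure 3) can be written out as a plain mapping. The indicator names follow the text; the pairing of each staff input with its function-specific output follows the scheme design described above.

```python
# The four schemes and their indicators as described in the text:
# Scheme 1 is the overall model; Schemes 2-4 pair one staff input with
# the output of the corresponding university function.
schemes = {
    "Scheme 1 (overall)": {"input": "TOTAL STAFF",
                           "outputs": ["STUDENTS", "SCI PUB.", "TT INCOME"]},
    "Scheme 2 (teaching)": {"input": "TEACHING STAFF", "outputs": ["STUDENTS"]},
    "Scheme 3 (research)": {"input": "R&D STAFF", "outputs": ["SCI PUB."]},
    "Scheme 4 (application)": {"input": "APPLICATION STAFF",
                               "outputs": ["TT INCOME"]},
}

# every scheme is a single-input DEA model, as stated in the text
assert all(isinstance(s["input"], str) for s in schemes.values())
```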

This paper focuses on the universities directly managed by the MOE of China, which have always led the direction of China’s higher education. The China Education Statistical Yearbook 2015 reveals that more than RMB 67.5 billion was concentrated in the 75 HEIs under the direct management of the MOE, accounting for almost half of the science and technology expenditure of Chinese regular HEIs. In terms of publications, these universities published 47.4% of the articles by Chinese regular HEIs in foreign journals and 18.1% of those in domestic journals in 2014. Besides, 43.6% of the actual technology transfer revenue of Chinese regular HEIs came from these universities in 2014. We further remove a few special universities, including art, finance, and language universities (e.g. Central Conservatory of Music, China Central Academy of Fine Arts), because they do not aim to promote scientific and technical progress. In addition, China University of Mining, China University of Petroleum, and China University of Geosciences each have branch campuses; the branch campuses are not independent institutions but are accounted for separately in the Chinese statistical system, so these six entities are also excluded from our sample. Hence, 58 Chinese universities are considered as our sample. All of them are Project 211 universities and are representative of China’s high-level research universities.

Our data for the indicators TEACHING STAFF, R&D STAFF, APPLICATION STAFF, and TT INCOME are collected from the Compilation of Science and technology statistics material of colleges and universities (2010–2017) issued by the MOE of China. Data for R&D STAFF, APPLICATION STAFF, and another indicator, teaching and research staff, are directly provided in the compilations. Teaching and research staff represents the personnel engaged in teaching and R&D activities in higher education institutions, including foreign experts and visiting scholars who have been engaged in scientific research activities for more than one month. R&D STAFF denotes teaching and research personnel who spend more than 10% of their working time on scientific research activities. Therefore, we obtain the number of TEACHING STAFF by subtracting the number of R&D STAFF from the number of teaching and research staff. Data for the indicator SCI PUB. are obtained from the InCites database

https://clarivate.com/products/incites/

provided by Clarivate Analytics (retrieved on Oct 25, 2018). Due to the discontinuity of the data sources, we obtain data for the indicator STUDENTS from the enrollment plan documents officially released by our sample universities for 2009 to 2013 and from their reports on the quality of graduate employment for 2013 to 2016. Therefore, our analysis of the results is separated into two time periods, 2009–2013 and 2013–2016.

Empirical Results

As the highest-level educational institutions and major research institutions in the country, Chinese universities assume three basic responsibilities: student training, scientific research, and social service. The input-output efficiency and productivity of universities not only directly affect the quality of talent and the level of scientific research but also embody the country’s potential for science and technology innovation. In this section, we use the MPI to investigate the efficiency evolution of the sample universities over the period 2009–2016.

The measurement of the MPI in Scheme 1 reflects the trend of the overall productivity of the sample universities during the investigated period. Combined with the decomposition results, a comprehensive understanding of the development of the higher education sector can be obtained from two angles: the improvement of the sector’s frontier technology and the progress of individual universities in pursuit of the best performers. The results are shown in Table 3.

Overall TFP and its decompositions (Scheme 1).

Periods | Catch-up | Frontier-shift | Overall TFP
2009–2010 | 0.816 | 1.272 | 1.037
2010–2011 | 0.959 | 1.174 | 1.126
2011–2012 | 1.485 | 0.812 | 1.206
2012–2013 | 0.943 | 1.170 | 1.102
2013–2014 | 0.978 | 1.126 | 1.101
2014–2015 | 1.045 | 1.042 | 1.089
2015–2016 | 0.941 | 1.133 | 1.066
Mean | 1.008 | 1.095 | 1.103

The overall TFP consistently exceeded unity, which shows that the sample universities continuously improved in all aspects and achieved an average increase of 10.3% in overall efficiency. On average, technical efficiency grew at an annual rate of 0.8%, and technological progress increased by 9.5% annually. Therefore, technological progress (the Frontier-shift effect) was the main reason for the growth of the overall TFP in Chinese universities. Concretely, the Frontier-shift effect declined from 1.272 in 2009–2010 to 0.812 in 2011–2012 and then performed decently, with scores around 1.1, during the rest of the investigated periods. The performance of the Catch-up effect was unstable: its scores fluctuated around unity and exceeded unity only in the periods 2011–2012 and 2014–2015. These trends are exhibited by the three polylines in Figure 4. The period 2011–2012 evidently differs from the other periods, with an extremely high overall TFP but abnormal Catch-up and Frontier-shift values, which may be attributed to the promulgation of the Outline of Chinese National Medium- and Long-Term Education Reform and Development Plan in 2010. In response to the problems in Chinese higher education, the Outline summarized the experience gained in practice and formulated several strong policies, especially the university de-administration policy, which aims to weaken the administrative character of colleges and universities and to explore the full potential of academic resources. Meanwhile, a batch of management systems in line with the characteristics of colleges and universities was also introduced.
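As a quick consistency check on Table 3 (values copied from the table), each row’s TFP equals the product of its Catch-up and Frontier-shift terms up to rounding, and the reported means are the geometric means of the seven period values:

```python
# Consistency check on Table 3: row-wise, TFP = Catch-up x Frontier-shift
# (up to rounding), and the "Mean" row is the geometric mean of the
# seven period values.
import math

catch_up = [0.816, 0.959, 1.485, 0.943, 0.978, 1.045, 0.941]
frontier_shift = [1.272, 1.174, 0.812, 1.170, 1.126, 1.042, 1.133]
overall_tfp = [1.037, 1.126, 1.206, 1.102, 1.101, 1.089, 1.066]

def geo_mean(values):
    return math.exp(sum(map(math.log, values)) / len(values))

for c, f, t in zip(catch_up, frontier_shift, overall_tfp):
    assert abs(c * f - t) < 5e-3        # row-wise product, rounding aside

print(round(geo_mean(catch_up), 3),        # 1.008
      round(geo_mean(frontier_shift), 3),  # 1.095
      round(geo_mean(overall_tfp), 3))     # 1.103
```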

Figure 4

Line chart of the overall TFP and its decompositions in Scheme 1.

To examine the changes in the efficiency of teaching activities during the studied period and identify their contribution to the overall TFP, Scheme 2 measures the changes in the efficiency of teaching activities. According to the calculation results shown in Table 4, the teaching TFP was consistently stable at around unity, with an average value of 0.992, indicating that higher education practitioners failed to improve the efficiency of teaching activities during this time. The two decomposition components show exactly opposite trends: during the periods 2009–2010, 2012–2014, and 2015–2016, the Catch-up effect stays at a high level while the Frontier-shift effect performs poorly; during the remaining periods, the Frontier-shift effect performs better than the Catch-up effect.

Teaching TFP and its decompositions (Scheme 2).

Periods | Catch-up | Frontier-shift | Teaching TFP
2009–2010 | 1.743 | 0.558 | 0.972
2010–2011 | 0.844 | 1.120 | 0.946
2011–2012 | 0.861 | 1.180 | 1.016
2012–2013 | 1.651 | 0.604 | 0.997
2013–2014 | 1.108 | 0.928 | 1.029
2014–2015 | 0.981 | 1.088 | 1.067
2015–2016 | 1.370 | 0.673 | 0.923
Mean | 1.176 | 0.843 | 0.992

We also employ a line chart to display the calculation results, as shown in Figure 5. Compared with the well-performing overall TFP in Scheme 1, we can determine that the source of the strong overall TFP was not the productivity of teaching activities. In fact, the low average TFP of teaching activities has seriously hindered the growth of the overall TFP.

Figure 5

Line chart of the teaching TFP and its decompositions in Scheme 2.

The productivity of scientific research activities during 2009–2016 and its decompositions are calculated in Scheme 3, and the results can be used to confirm the contribution of scientific research activities to the overall TFP. The results, shown in Table 5, illustrate that the research TFP remained at a high level, above 1.1. This indicates that the efficiency of scientific research activities improved continuously, which is consistent with researchers’ enthusiasm for research activities and the government’s incentive schemes promoting research.

Research TFP and its decompositions (Scheme 3).

Periods | Catch-up | Frontier-shift | Research TFP
2009–2010 | 0.839 | 1.327 | 1.113
2010–2011 | 0.984 | 1.214 | 1.195
2011–2012 | 1.044 | 1.135 | 1.185
2012–2013 | 1.367 | 0.826 | 1.129
2013–2014 | 1.080 | 1.021 | 1.102
2014–2015 | 1.075 | 1.059 | 1.139
2015–2016 | 0.949 | 1.184 | 1.123
Mean | 1.038 | 1.099 | 1.140

The two decompositions of the research TFP show different trends during the studied period. From 2009–2010 to 2012–2013, the Catch-up effect increased continuously from 0.839 to 1.367, while the Frontier-shift effect decreased from the initial 1.327 to 0.826. Afterward, the Catch-up effect dropped promptly to 0.949, while the Frontier-shift effect bounced back to 1.184 by the end of the period. Therefore, we can conclude that the relatively high research TFP scores in Scheme 3 derive from two balanced and high-valued decompositions. The changes in the research TFP and its decompositions in Scheme 3 are shown in Figure 6. Moreover, the excellent performance of the research TFP is highly consistent with the overall TFP in Scheme 1, which shows that the scientific research activities of universities have played a positive role in promoting the efficiency of the entire higher education system.

Figure 6

Line chart of the research TFP and its decompositions in Scheme 3.

As main innovators in science and technology, universities have the responsibility and ability to carry out activities that transform scientific and technological achievements. The efficiency of technology transformation has a crucial impact on the overall efficiency of a university, and its evaluation is an indispensable part of university performance analysis. It should be noted that the technology transformation function is not present in the operation of certain universities, so these DMUs are removed from the calculation to guarantee the reasonableness of the scores. The results in Table 6 indicate that the transfer TFP fluctuated around unity during the whole period. According to the volatility of the two decompositions, the research period can be divided into two parts: prior to 2012–2013, the transfer TFP shows an obvious consistency with the fluctuation of the Catch-up effect; thereafter, the Frontier-shift effect dominates the transfer TFP. The calculation results of Scheme 4 are shown as a line chart in Figure 7. In general, the transfer TFP is relatively low, and it may have a weak negative impact on the overall TFP.

Figure 7

Line chart of the transfer TFP and its decompositions in Scheme 4.

Transfer TFP and its decompositions (Scheme 4).

Periods | Catch-up | Frontier-shift | Transfer TFP
2009–2010 | 0.647 | 1.465 | 0.949
2010–2011 | 1.099 | 1.075 | 1.181
2011–2012 | 0.983 | 1.035 | 1.017
2012–2013 | 0.646 | 1.341 | 0.866
2013–2014 | 2.435 | 0.313 | 0.761
2014–2015 | 1.018 | 1.659 | 1.689
2015–2016 | 2.118 | 0.402 | 0.851
Mean | 1.131 | 0.894 | 1.011

In summary, the TFP values calculated by the four schemes reflect the main features of Chinese universities’ development during 2009–2016. The average overall TFP (Scheme 1) for this period is 1.103, with the Frontier-shift effect as the main contributing factor, indicating that Chinese universities have successfully improved their overall performance by advancing frontier technology during the investigated period. From the perspective of the different functions of universities, the average research TFP (Scheme 3) and transfer TFP (Scheme 4) are 1.140 and 1.011, respectively, while the teaching TFP (Scheme 2) is below 1 (0.992). To examine the relationship between the overall TFP and the scheme-specific TFPs, we merge the TFPs of the four schemes into the radar map in Figure 8. The results show that the polyline representing the overall TFP lies mostly inside the polyline of the research TFP, while the polyline of the teaching TFP lies mostly inside the polyline of the overall TFP. The polylines of the transfer TFP and the overall TFP are interlaced during the investigated period, with the transfer TFP exceeding the overall TFP only in 2010–2011 and 2014–2015. This apparent difference fully demonstrates that the current guideline of the Chinese government, which attaches great importance to scientific research in its evaluation and incentive policies, has made a real impact on the performance of the different activities carried out in the universities. If such trends continue, Chinese higher education quality may risk deterioration, and the training and retention of talent will be severely affected, implying a hidden danger for the rapid development of society.

Figure 8

Comparison of the overall TFP and TFPs in different schemes.

Policy implications

Through the review of education-related policies in the Chinese science system and universities, as well as the analysis of the productivity results, we have obtained three main findings.

First, since the end of the Cultural Revolution, China has carried out many reforms of its education system and has introduced numerous management policies. As important carriers of education, technological innovation, and social services, Chinese universities undertake the important missions of education, research, and achievement transformation. However, a biased guideline that particularly emphasizes scientific research prevails in China’s incentive and evaluation systems, from the state to universities and from universities to educators. This has driven year-on-year increases in China’s scientific research outputs, and scientific research efficiency has also increased steadily.

Second, the overall TFP consistently exceeded one, with an average value of 1.103, which indicates that the sample universities continuously improved their efficiency. Overall technical efficiency increased by 0.8% annually on average, while technological progress increased by 9.5% annually on average. Therefore, technological progress was the main driving force behind the growth of the overall TFP in Chinese universities.

Third, the teaching TFP performs significantly worse than the research TFP and the transfer TFP. Specifically, the annual average teaching TFP is only 0.992 during the studied period, while the research TFP and the transfer TFP are 1.140 and 1.011, respectively. Therefore, we can conclude that teaching activities have not received sufficient attention from colleges and universities.

Based on the above findings, we propose the following three policy suggestions for Chinese universities and the higher education administrative departments.

First, the Chinese government should coordinate the three missions of Chinese universities, namely teaching, scientific research, and achievement transformation, and promote their joint development. To improve teaching quality and efficiency, it is necessary to create more opportunities in universities for teacher training and to upgrade teaching equipment so that students can acquire more knowledge and skills through more extensive teaching activities. This can also provide a platform for the development of scientific research and a basis for the industrialization of its achievements. Meanwhile, we suggest that Chinese education administrative departments pay more attention to the guidance of university teaching and increase the resources invested in teaching activities to achieve balanced development in Chinese universities.

Second, it is necessary for the Chinese government to continue supporting the comprehensive development of high-level universities, which can serve as models or frontiers for other universities and motivate the Catch-up effect among them, so that the positive impact of the Frontier-shift effect on the overall TFP can be fully realized.

Third, the Chinese government should strengthen the evaluation of teaching efficiency in universities and establish a comparative mechanism among them. For example, educational resources can be allocated based on comparisons of teaching efficiency between universities in previous years: universities with higher teaching efficiency would receive more educational resources, while those with lower efficiency would receive fewer, thereby enhancing the sense of competition in teaching activities among universities.

Conclusions and discussions

The primary objective of this paper is to clarify the development path of China’s higher education from the aspects of public policy and efficiency evolution. To address these issues, we first summarize the science policies since the end of the Cultural Revolution as well as the educational resource allocation and reward regulations in Chinese society and within universities. The DEA-based Malmquist index and its decompositions are then used to measure and explain the productivity of Chinese universities. In accordance with the main functions of higher education, we design four schemes and determine their input and output indicators with reference to the existing research on educational efficiency.

Based on a rich dataset of Chinese universities directly managed by the MOE of China, we conduct quantitative computation and analysis on 58 sample universities to grasp the productivity status and evolution of the Chinese higher education system and to identify the core factors affecting the overall TFP. Our main results suggest that the policies implemented in the higher education system after the end of the Cultural Revolution have greatly promoted the rapid development of higher education in China, but a problem of unbalanced development between different functions remains. The sample universities increased their productivity by 10.3% per year on average, an increase mainly driven by the Frontier-shift effect. From the perspective of the functions of higher education, the continuously low productivity of teaching activities has hindered the improvement of the overall TFP, while the efficiency of scientific research and achievement transformation activities remained at a decent level, which stabilized and supported the long-term growth of the overall TFP in Chinese universities. We also put forward corresponding policy suggestions for policymakers on which educational functions and tiers of institutions resources should focus, and we propose evaluating teaching efficiency as part of university management.

Nevertheless, several points beyond our present scope are left for future research. First, we employ the commonly used input and output indicators related to higher education; adopting indicators that reflect the impact and quality of outputs will be a focus of our future research. Second, the eight-year investigation period in this paper can be extended in future research. Third, the circularity problem of the MPI has been investigated through different techniques, so the choice and application of a reasonable solution in research on scientific productivity would be of great value.

Figure 1

GERD of several countries from 2001 to 2016 (millions in 2017 constant USD). Note: The data of China doesn’t include the statistical data of Chinese Taiwan.

Figure 2

An illustrative case of the Malmquist index.

Figure 3

Input and output indicators in different schemes.
Input and output indicators in different schemes.

Figure 4

Line chart of the overall TFP and its decompositions in Scheme 1.
Line chart of the overall TFP and its decompositions in Scheme 1.

Figure 5

Line chart of the teaching TFP and its decompositions in Scheme 2.
Line chart of the teaching TFP and its decompositions in Scheme 2.

Figure 6

Line chart of the research TFP and its decompositions in Scheme 3.
Line chart of the research TFP and its decompositions in Scheme 3.

Figure 7

Line chart of the transfer TFP and its decompositions in Scheme 4.
Line chart of the transfer TFP and its decompositions in Scheme 4.

Figure 8

Comparison of the overall TFP and TFPs in different schemes.
Comparison of the overall TFP and TFPs in different schemes.

Teaching TFP and its decompositions (Scheme 2).

PeriodsCatch-upFrontier-shiftTeaching TFP
2009–20101.7430.5580.972
2010–20110.8441.1200.946
2011–20120.8611.1801.016
2012–20131.6510.6040.997
2013–20141.1080.9281.029
2014–20150.9811.0881.067
2015–20161.3700.6730.923
Mean1.1760.8430.992

Research TFP and its decompositions (Scheme 3).

PeriodsCatch-upFrontier-shiftResearch TFP
2009–20100.8391.3271.113
2010–20110.9841.2141.195
2011–20121.0441.1351.185
2012–20131.3670.8261.129
2013–20141.0801.0211.102
2014–20151.0751.0591.139
2015–20160.9491.1841.123
Mean1.0381.0991.140

Input and output indicators in the existing literature.

Agasisti (2011). Inputs: Expenditure, Entry rates, Students: teachers. Outputs: Population, Graduation rates, Employment, Foreign students.
Agasisti and Bianco (2009). Inputs: Costs for non-academic staff, Costs for academic staff, Costs for all staff, Other costs, Total costs. Outputs: Students enrolled in scientific courses, Students enrolled in non-scientific courses, Total number of students, Ph.D. students, External funds for research activities.
Agasisti and Johnes (2009). Inputs: Total number of students, Total amount of financial resources/incomes, Number of Ph.D. students, Number of academic staff. Outputs: Number of graduates, Total amount of external grants and contracts for research.
Agasisti and Pohl (2012). Inputs: Students, Academic staff, Expenditures. Outputs: Graduates, External research.
Barra et al. (2018). Inputs: Number of academic staff, Percentage of enrolments with a score higher than 9/10 in secondary school, Percentage of enrolments who attended a lyceum, Total number of students. Outputs: Number of graduates weighted by their degree classification, Research grants.
Barra and Zotti (2016). Inputs: Equivalent personnel, Total amount of financial resources the department spends on research activities, Total amount of financial resources the faculty allocates for teaching activities, Total number of students enrolled. Outputs: Number of publications, Total external research funding obtained by the university, Research productivity index / capacity of attracting resources index / research productivity per cost of the academic staff index, Number of graduates weighted by their degree classification, Student satisfaction index / undergraduate satisfaction index.
Barros et al. (2011). Inputs: Number of educational years in higher education, Experience of the workers without student job. Outputs: Wages.
Edvardsen et al. (2017). Inputs: Faculty employees, Administration and other employees. Outputs: Study points for courses of a lower degree, Study points for courses of a higher degree, Publishing points, Doctorates/Ph.D.s.
de França et al. (2010). Inputs: Total number of professors with Ph.D. degrees, Total number of support and administrative personnel, Total number of majors offered. Outputs: Total number of students that graduated during the year, Total number of candidates registered to take the entrance examination, Total number of students enrolled at the university.
Johnes (2013). Inputs (first node): Intake quality, Student: staff ratio, Per-student spend; (second node): Degree results, Research reputation. Outputs (first node): Student satisfaction, Degree results; (second node): Employability.
Singh and Ranjan (2017). Inputs: Number of teachers (Professors and Associate Professors), Number of teachers (Assistant Professors and others), Number of non-teaching staff. Outputs: Students enrolled in Ph.D. and M.Phil., Students enrolled in P.G. and P.G. Diploma, Students enrolled in U.G., Students enrolled in Diploma, Certificate, and Integrated courses, Students enrolled in P.G. and P.G. Diploma open courses, Students enrolled in U.G. open courses.
Thanassoulis et al. (2011). Inputs: Total operating cost. Outputs: Undergraduates in medicine or dentistry, Undergraduate science students, Undergraduate non-science students, Postgraduate students in all disciplines, Quality-related funding and research grants, Income from other services.
Wolszczak-Derlacz (2017). Inputs: Academic staff, Total revenue, Total number of students, Non-academic staff. Outputs: Publications, Graduates, Scientific articles.

The expansion of Chinese higher education from 1998 to 2007.

Year                         1998    1999    2000    2001    2002    2003    2004    2005    2006    2007
Enrollment (10 thousands)    108.4   159.7   220.6   268.3   320.5   382.2   447.3   504.5   546.1   565.9
Number of universities       1,022   1,071   1,041   1,225   1,396   1,552   1,731   1,792   1,867   1,908
Enrollment rate (%)          9.8     10.5    12.2    12.9    15.0    17.0    19.0    21.0    22.0    23.0
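Enrollment roughly quintupled over this decade. The implied compound annual growth rate can be checked directly from the endpoints of the table above (a minimal arithmetic sketch):

```python
# Enrollment (in 10-thousands) at the endpoints of the 1998-2007 expansion table.
start, end, years = 108.4, 565.9, 2007 - 1998

# Compound annual growth rate over the nine year-on-year steps.
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # roughly 20% annual growth
```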

Transfer TFP and its decompositions (Scheme 4).

Periods      Catch-up   Frontier-shift   Transfer TFP
2009–2010    0.647      1.465            0.949
2010–2011    1.099      1.075            1.181
2011–2012    0.983      1.035            1.017
2012–2013    0.646      1.341            0.866
2013–2014    2.435      0.313            0.761
2014–2015    1.018      1.659            1.689
2015–2016    2.118      0.402            0.851
Mean         1.131      0.894            1.011

Overall TFP and its decompositions (Scheme 1).

Periods      Catch-up   Frontier-shift   Overall TFP
2009–2010    0.816      1.272            1.037
2010–2011    0.959      1.174            1.126
2011–2012    1.485      0.812            1.206
2012–2013    0.943      1.170            1.102
2013–2014    0.978      1.126            1.101
2014–2015    1.045      1.042            1.089
2015–2016    0.941      1.133            1.066
Mean         1.008      1.095            1.103
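The Mean row in the table above is consistent with a geometric mean of the seven period indices, which is the natural average for chained productivity indices (an arithmetic mean does not reproduce the reported values). Assuming that convention, the row can be reproduced as follows:

```python
import math

# Period values for Scheme 1 (overall), copied from the table above.
catch_up = [0.816, 0.959, 1.485, 0.943, 0.978, 1.045, 0.941]
frontier = [1.272, 1.174, 0.812, 1.170, 1.126, 1.042, 1.133]
tfp      = [1.037, 1.126, 1.206, 1.102, 1.101, 1.089, 1.066]

def gmean(values):
    """Geometric mean computed via logs for numerical stability."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

print(round(gmean(catch_up), 3), round(gmean(frontier), 3), round(gmean(tfp), 3))
# matches the Mean row: 1.008 1.095 1.103
```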

Agasisti, T. (2011). Performances and spending efficiency in higher education: A European comparison. Social Science Electronic Publishing, 19(2), 199–224.

Agasisti, T., & Bianco, A.D. (2009). Measuring efficiency of higher education institutions. International Journal of Management & Decision Making, 10(5–6), 443–465. doi: 10.1504/IJMDM.2009.026687

Agasisti, T., & Johnes, G. (2009). Beyond frontiers: Comparing the efficiency of higher education decision-making units across more than one country. Education Economics, 17(1), 59–79. doi: 10.1080/09645290701523291

Agasisti, T., & Pohl, C. (2012). Comparing German and Italian public universities: Convergence or divergence in the higher education landscape? Managerial & Decision Economics, 33(2), 71–85. doi: 10.1002/mde.1561

Asmild, M., Paradi, J.C., Aggarwall, V., & Schaffnit, C. (2004). Combining DEA window analysis with the Malmquist index approach in a study of the Canadian banking industry. Journal of Productivity Analysis, 21(1), 67–89. doi: 10.1023/B:PROD.0000012453.91326.ec

Avkiran, N.K. (2001). Investigating technical and scale efficiencies of Australian universities through data envelopment analysis. Socio-Economic Planning Sciences, 35, 57–80. doi: 10.1016/S0038-0121(00)00010-0

Barra, C., Lagravinese, R., & Zotti, R. (2018). Does econometric methodology matter to rank universities? An analysis of Italian higher education system. Socio-Economic Planning Sciences, 62, 104–120. doi: 10.1016/j.seps.2017.09.002

Barra, C., & Zotti, R. (2016). Measuring efficiency in higher education: An empirical study using a bootstrapped data envelopment analysis. International Advances in Economic Research, 22(1), 11–33. doi: 10.1007/s11294-015-9558-4

Barros, C.P., Guironnet, J.P., & Peypoch, N. (2011). Productivity growth and biased technical change in French higher education. Economic Modelling, 28(1–2), 641–646. doi: 10.1016/j.econmod.2010.06.005

Caves, D.W., Christensen, L.R., & Diewert, W.E. (1982). The economic theory of index numbers and the measurement of input, output, and productivity. Econometrica, 50, 1393–1414. doi: 10.2307/1913388

Cooper, W.W., Seiford, L.M., & Tone, K. (2007). Data envelopment analysis: A comprehensive text with models, applications, references and DEA-Solver software (2nd ed.). New York: Springer. doi: 10.1007/978-0-387-45283-8

Edvardsen, D.F., Førsund, F.R., & Kittelsen, S.A.C. (2017). Productivity development of Norwegian institutions of higher education 2004–2013. Journal of the Operational Research Society, 68(4), 399–415. doi: 10.1057/s41274-017-0183-x

De França, J.M.F., de Figueiredo, J.N., & dos Santos Lapa, J. (2010). A DEA methodology to evaluate the impact of information asymmetry on the efficiency of not-for-profit organizations with an application to higher education in Brazil. Annals of Operations Research, 173(1), 39–56. doi: 10.1007/s10479-009-0536-1

Development Planning Division (MOE, PRC). (2010–2014). China Education Statistical Yearbook. Beijing: People's Education Press.

Färe, R., Grosskopf, S., Lindgren, B., & Roos, P. (1989). Productivity developments in Swedish hospitals: A Malmquist output index approach. In A. Charnes, W.W. Cooper, A. Lewin, & L. Seiford (Eds.), Data envelopment analysis: Theory, methodology and applications. Quorum Books.

Färe, R., Grosskopf, S., & Lovell, C.A.K. (1994). Production frontiers. Cambridge, UK: Cambridge University Press.

Färe, R., Grosskopf, S., & Margaritis, D. (2006). Productivity growth and convergence in the European Union. Journal of Productivity Analysis, 25(1–2), 111–141. doi: 10.1007/s11123-006-7134-x

Glass, J.C., McCallion, G., McKillop, D.G., Rasaratnama, S., & Stringer, K.S. (2006). Implications of variant efficiency measures for policy evaluations in UK higher education. Socio-Economic Planning Sciences, 40, 119–142. doi: 10.1016/j.seps.2004.10.004

Johnes, G. (2013). Efficiency in English higher education institutions revisited: A network approach. Economics Bulletin, 33, 2698–2706.

Liu, X., & Zhi, T. (2012). China is catching up in science and innovation: The experience of the Chinese Academy of Sciences. Science & Public Policy, 37(5), 331–342. doi: 10.3152/030234210X501162

Luo, L.P., & Wang, D.H. (2007). A review of the Chinese Higher Education Evaluation Center. Journal of MultiDisciplinary Evaluation, 4(7), 92–93.

Malmquist, S. (1953). Index numbers and indifference surfaces. Trabajos de Estadistica, 4, 209–242. doi: 10.1007/BF03006863

Merton, R.K. (1957). Priorities in scientific discovery: A chapter in the sociology of science. American Sociological Review, 22(6), 635–659. doi: 10.2307/2089193

Merton, R.K. (1968). The Matthew effect in science. Science, 159, 56–63. doi: 10.1126/science.159.3810.56

Odeck, J. (2000). Assessing the relative efficiency and productivity growth of vehicle inspection services: An application of DEA and Malmquist indices. European Journal of Operational Research, 126(3), 501–514. doi: 10.1016/S0377-2217(99)00305-7

OECD. (2018). Gross domestic spending on R&D (indicator). doi: 10.1787/d8b068b4-en

Pastor, J.T., & Lovell, C.A.K. (2005). A global Malmquist productivity index. Economics Letters, 88, 266–271. doi: 10.1016/j.econlet.2005.02.013

Quan, W., Chen, B., & Shu, F. (2017). Publish or impoverish: An investigation of the monetary reward system of science in China (1999–2016). Aslib Journal of Information Management, 69(5), 486–502. doi: 10.1108/AJIM-01-2017-0014

Research group of competitiveness analysis of Chinese basic research of National Science Library. (2015). Blue Book of Basic Research Competitiveness for China in 2015. Beijing (in Chinese). Retrieved from https://wenku.baidu.com/view/47b52fbf6bd97f192379e964.html

Rousseau, S., & Rousseau, R. (1997). Data envelopment analysis as a tool for constructing scientometric indicators. Scientometrics, 40(1), 45–56. doi: 10.1007/BF02459261

Schmoch, U., & Schubert, T. (2009). Sustainability of incentives for excellent research – The German case. Scientometrics, 81, 195–218. doi: 10.1007/s11192-009-2127-y

Schmoch, U., Schubert, T., Jansen, D., Heidler, R., & Görtz, R.V. (2010). How to use indicators to measure scientific performance: A balanced approach. Research Evaluation, 19(1), 2–18. doi: 10.3152/095820210X492477

Schubert, T. (2009). Empirical observations on new public management to increase efficiency in public research – Boon or bane? Research Policy, 38, 1225–1234. doi: 10.1016/j.respol.2009.06.007

Schubert, T. (2014). Are there scale economies in scientific production? On the topic of locally increasing returns to scale. Scientometrics, 99, 393–408. doi: 10.1007/s11192-013-1207-1

Shephard, R.W. (1970). Theory of costs and production functions. Princeton, NJ: Princeton University Press.

Singh, S., & Ranjan, P. (2017). Efficiency analysis of non-homogeneous parallel sub-unit systems for the performance measurement of higher education. Annals of Operations Research, 1–26. doi: 10.1007/s10479-017-2586-0

Thanassoulis, E., Kortelainen, M., Johnes, G., & Johnes, J. (2011). Costs and efficiency of higher education institutions in England: A DEA analysis. Journal of the Operational Research Society, 62(7), 1282–1297. doi: 10.1057/jors.2010.68

Tone, K. (2004). Malmquist productivity index: Efficiency change over time. In W.W. Cooper, L.M. Seiford, & J. Zhu (Eds.), Handbook on data envelopment analysis. Norwell, MA: Kluwer Academic Publishers.

Wolszczak-Derlacz, J. (2017). An evaluation and explanation of (in)efficiency in higher education institutions in Europe and the US with the application of two-stage semi-parametric DEA. Research Policy, 46(9), 1595–1605. doi: 10.1016/j.respol.2017.07.010

Worthington, A.C., & Lee, B.L. (2008). Efficiency, technology and productivity in Australian universities, 1998–2003. Economics of Education Review, 27, 285–298. doi: 10.1016/j.econedurev.2006.09.012

Yang, G.L., Rousseau, R., Yang, L.Y., & Liu, W.B. (2014). A study on directional returns to scale. Journal of Informetrics, 8(3), 628–641. doi: 10.1016/j.joi.2014.05.004

Yong, S.L. (1996). ‘Technology transfer’ and the research university: A search for the boundaries of university-industry collaboration. Research Policy, 25(6), 843–863. doi: 10.1016/0048-7333(95)00857-8

Zhang, D.Q., Banker, R.D., Li, X.X., & Liu, W.B. (2011). Performance impact of research policy at the Chinese Academy of Sciences. Research Policy, 40, 875–885. doi: 10.1016/j.respol.2011.03.010
