AHEAD OF PRINT
Journal details
Format
Journal
eISSN
2444-8656
First published
01 Jan 2016
Publication frequency
2 times per year
Languages
English
Open Access

Performance evaluation of college laboratories based on fusion of decision tree and BP neural network

Published online: 08 Apr 2022
Volume & Issue: AHEAD OF PRINT
Pages: -
Received: 06 Jan 2021
Accepted: 29 Sep 2021
Abstract

Performance evaluation can promote the continuous improvement of college laboratories, and a scientific evaluation method is essential to the process. In this paper, a performance evaluation method based on the fusion of a decision tree and a BP neural network is presented. The decision tree model is used to select the performance evaluation indexes with high weight, while the BP neural network is adopted to reduce the impact of non-core factors on the predicted classification. First, the data were pre-processed with trapezoidal membership functions. Then, a decision tree was generated by the C4.5 algorithm to select the evaluation indexes with high weight. Finally, the BP neural network was trained on as many samples as possible over those indexes; it encodes the experts' experience and can be used to predict the performance evaluation results. The method overcomes the shortcomings of each separate model, reduces the disturbance of human factors and improves the accuracy of the evaluation. Experiments show that the model is feasible and effective for the performance evaluation of college laboratories. The outcomes of this work provide a scientific evaluation method for researchers, college administrators and laboratory managers, help them improve the management of laboratories and offer decision references for laboratory construction.

Keywords

Introduction

College laboratories undertake the teaching task of cultivating students' practical ability and capacity for scientific innovation. The aim of performance evaluation is to assess the outcomes of the laboratories, identify both the strengths and the weaknesses of their management and provide administrators with suggestions for continuous improvement. Performance evaluation deserves serious attention since it can strengthen the construction and management of the laboratories. Therefore, a scientific evaluation method is necessary during the evaluation process.

Most traditional evaluation methods rely on experts' opinions and are therefore prone to subjective influence. To overcome the shortcomings of the traditional methods and to improve effectiveness and accuracy, scholars have attempted to introduce soft computing techniques for performance evaluation [1, 2]. These methods mainly rely on mathematical algorithms to quantify the task of performance evaluation. Generally speaking, they are more scientific and standardised than traditional methods. However, the existing methods are not systematic, since they involve subjective factors that are hard to quantify.

A decision tree is an inductive learning technique with high speed and high accuracy in classification prediction. The BP neural network is an adaptive system which can effectively reduce the influence of subjective factors and enhance the objectivity of the evaluation results. The decision tree has the advantages of a clear structure and simple classification rules, which can offset the shortcomings of the BP neural network, such as slow convergence and poor interpretability. In this respect, a method combining the decision tree and the neural network is well suited to the performance evaluation of college laboratories. However, there are few research studies on this subject.

Motivated by the above considerations, a performance evaluation method based on the fusion of a decision tree and a BP neural network is proposed in this paper. The decision tree is used to select the performance evaluation indexes with high weight, and the BP neural network is adopted to reduce the impact of non-core factors on the predicted classification.

The paper is structured as follows. Section 2 describes the related works on the subject. Brief details of the decision tree and the BP neural network are provided in Section 3. The performance evaluation method based on their fusion is presented in Section 4. A short comparative analysis of three models is given in Section 5. Section 6 summarises the conclusions.

Related works

The concept of performance evaluation was originally proposed by American management scientist Aubrey Daniels in 1970 [3]. Until now, scholars have proposed various methods based on different soft computing techniques, which are aimed to improve the effectiveness and accuracy of the performance evaluation. Recently, widely used assessment methods are fuzzy logic [4], fuzzy comprehensive evaluation (FCE) [5], analytic hierarchy process (AHP) [6,7], entropy weight fuzzy model [8,9], data envelopment analysis (DEA) [10,11], balanced scorecard [12, 13], decision tree [14] and BP network [15].

Guo et al. [16] constructed a multi-judgement FCE model to deal with situations in which experts have more than one choice in the evaluation. Roberti et al. [17] used AHP to quantify conservation compatibility in order to find and compare optimal retrofits for historic buildings. Jia et al. [18] used the AHP model to calculate index weights and established a performance evaluation model for relevant personnel, which solved the problems that qualitative performance evaluation had to deal with. Jing et al. [19] constructed an FCE model of laboratory safety in colleges and universities based on AHP. Wu et al. [20] established a safety evaluation index system determined by expert consultation and AHP, and carried out the evaluation with the FCE method. Shunling et al. [21] selected FCE based on the entropy weight method to evaluate instrument utilisation. Li et al. [22] proposed a comprehensive evaluation method for laboratories based on an entropy weight fuzzy matter-element model. Someh et al. [23] proposed an evaluation method to measure the efficiency and ranking of medical diagnostic laboratories by applying a network DEA. Weipeng et al. [24] established a data index system of laboratory inputs and outputs, using DEA to make a comprehensive evaluation. Lu et al. [25] constructed a performance evaluation model based on a scorecard with entropy constraint.

The approaches mentioned above were verified by experimental data, and they performed better than traditional evaluation methods. However, they are all restrictive because they often yield conflicting conclusions about efficiency due to the unsuitability of their assumptions [26]. There is still room for improvement in reducing the influence of subjective factors and decreasing the uncertainty of the outcomes.

The decision tree technique is an inductive learning technique which can infer classifications from a set of random data samples. Decision tree algorithms, for instance CART [27] and ID3 [28], use predicted classifications to label the leaves in the process of constructing the tree. Further research has indicated that decision tree models run quickly when training on data. As a result, the decision tree is often applied to constructing evaluation models. For instance, Budiman et al. [29] conducted a study on student academic evaluation using the Decision Tree C4.5 technique.

With powerful storage and self-adaptive learning capability, the BP neural network can process nonlinear data. The BP neural network works by incrementally changing the weights in a network consisting of elements called neurons [30]. Nevertheless, BP neural network algorithms are sensitive to noisy data, although they almost always achieve better performance in classifying novel examples. The BP neural network has now been applied in various fields, such as the evaluation of campus safety, laboratory management, teaching quality and laboratory service. Zhang, Shi, Li, Lu et al. [31, 32, 33, 34] set up evaluation models of college laboratories based on the BP neural network.

Both the decision tree and the BP neural network algorithm have their own shortcomings when constructing evaluation models. If only a decision tree is used, the error grows as the depth of the tree increases. If only the BP neural network is used, the choice of network structure depends mainly on subjective experience owing to the lack of theoretical guidance. To let the two algorithms complement each other, it is necessary to combine them into an effective and accurate performance evaluation method. Methodologies that combine the two models have already been applied in various areas, such as teachers' performance evaluation, ship collision risk assessment [35], earthquake infrasonic wave detection [36] and marketing investigation [37]. However, none of the mentioned papers focuses on the performance evaluation of college laboratories. To make the performance evaluation more effective and accurate, we propose a method that combines the decision tree and the neural network. The decision tree is used to reduce the dimension of the input layer, which increases the convergence speed and improves the prediction precision of the BP neural network.

Decision tree and BP neural network
The decision tree algorithm
Introduction to the decision tree

The decision tree consists of decision-making nodes, state nodes, probability branches and terminal nodes [38]. In general, the decision tree is constructed recursively from top to bottom. The procedure is divided into three steps: splitting nodes, choosing which nodes are terminal nodes and assigning a class label to each terminal node.

A decision tree is drawn like the branching limbs of a tree, one branch per decision outcome. As an inductive learning algorithm, the decision tree can classify irregular data and present the result in the form of classification rules.

At present, the widely used decision tree generation algorithms are ID3, C4.5 and C5.0. The C4.5 algorithm is more efficient than the others because it can achieve a higher recognition rate with fewer nodes. In this paper, the decision tree was generated by the C4.5 algorithm.

Information gain ratio

The pivotal part of the decision tree is choosing the proper test attribute for splitting nodes. The information gain ratio is used as the classification criterion in the C4.5 decision tree algorithm.

Let S stand for the current training sample set, which the class attribute divides into m classes, marked as C_i (i = 1, 2, ..., m). |S| is the number of samples in S, and r_i is the number of samples belonging to class C_i. The entropy of the class distribution of S can then be expressed as follows:
$$I(r_1, r_2, \ldots, r_m) = -\sum_{i=1}^{m} p_i \log_2(p_i), \quad p_i = \frac{r_i}{|S|}$$

If attribute A has n different values (a_1, a_2, ..., a_n), the set S can be partitioned into n subsets by the values of A, marked as S_j (j = 1, 2, ..., n); all samples in S_j share the same value of attribute A. Let s_ij denote the number of samples in S_j that belong to class C_i (i = 1, 2, ..., m). The uncertainty of these samples can be described by the expected information value T_ij:
$$T_{ij} = -p_{ij} \log_2(p_{ij}), \quad p_{ij} = \frac{s_{ij}}{|S_j|}$$

In formula (2), for each subset S_j: p_{1j} + p_{2j} + ... + p_{mj} = 1. The entropy of the subset with value a_j is marked as I(s_{1j}, s_{2j}, ..., s_{mj}):
$$I(s_{1j}, s_{2j}, \ldots, s_{mj}) = -\sum_{i=1}^{m} p_{ij} \log_2(p_{ij})$$

The information gain Gain(A) is the difference between the entropy before and after splitting on attribute A:
$$Gain(A) = I(r_1, r_2, \ldots, r_m) - E(A)$$

In formula (4), E(A) represents the expected entropy after partitioning S by the values of attribute A:
$$E(A) = \sum_{j=1}^{n} w_j \, I(s_{1j}, s_{2j}, \ldots, s_{mj}), \quad w_j = \frac{s_{1j} + s_{2j} + \cdots + s_{mj}}{|S|}$$

The gain ratio Ratio(A) is given as follows:
$$Ratio(A) = \frac{Gain(A)}{Split(A)}$$

In formula (6), Split(A) represents the split information, calculated over the n subsets induced by attribute A:
$$Split(A) = -\sum_{j=1}^{n} p'_j \log_2(p'_j), \quad p'_j = \frac{|S_j|}{|S|}$$

In formula (7), if attribute A itself is the classification criterion of the decision tree, Split(A) equals the entropy of the training samples.
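As an illustration, formulas (1)-(7) can be sketched in Python. This is a minimal sketch with our own function and variable names, not code from the paper:

```python
import math
from collections import Counter

def entropy(labels):
    """I(r_1, ..., r_m): entropy of the class distribution in a sample set."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def gain_ratio(values, labels):
    """Ratio(A) = Gain(A) / Split(A) for one candidate attribute A.

    `values` holds each sample's value of attribute A, `labels` its class.
    """
    n = len(labels)
    base = entropy(labels)          # I(r_1, ..., r_m)
    expected = 0.0                  # E(A): weighted entropy of the subsets S_j
    split = 0.0                     # Split(A)
    for v in set(values):
        subset = [l for a, l in zip(values, labels) if a == v]
        w = len(subset) / n         # w_j = |S_j| / |S|
        expected += w * entropy(subset)
        split -= w * math.log2(w)
    gain = base - expected
    return gain / split if split > 0 else 0.0
```

For a perfectly separating attribute on two balanced classes, the gain and the split information are both 1 bit, so the ratio is 1.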

Decision tree generation algorithms

In the process of constructing a decision tree, the training data are repeatedly divided until each subset contains only a single class. As a consequence, the tree grows larger and more complex, and it may over-fit the noise in the training data. Therefore, it is necessary to prune the tree, aiming to decrease its complexity without reducing the classification accuracy. Pruning also makes the decision tree more effective and the training much faster.

The basic pruning strategies are divided into two types: pre-pruning and post-pruning. Pre-pruning evaluates each node before splitting during the construction of the decision tree; if splitting fails to improve the generalisation capability of the tree, the splitting is stopped and the node is labelled as a leaf node. Post-pruning is done after the decision tree has been generated: if the generalisation capability improves when a non-leaf node is replaced by a leaf node, the node is replaced. In general, the under-fitting risk of post-pruned decision trees is very small, and their generalisation performance is usually better than that of pre-pruned trees. Reduced error pruning (REP) [39], one of the post-pruning methods, is adopted in this paper.

The BP neural network
Introduction to the BP neural network

The BP neural network is a multi-layer feed-forward neural network trained by the error back-propagation algorithm. Its structure consists of one input layer, at least one hidden layer and one output layer. Neurons in adjacent layers are fully connected by weights, while there are no connections between neurons within the same layer. The computation involves two stages: forward propagation and back propagation. In forward propagation, input information is carried from the input layer to the hidden layer and from the hidden layer to the output layer; nonlinear processing is accomplished by applying an activation function to the summed inputs of each neuron. Back propagation starts if the desired output is not achieved at the output layer. The network then continuously adjusts the connection weights and thresholds according to the error signal, so that the output error approaches the required precision [40].

The construction of the BP neural network

The construction of the BP neural network is a three-step process: network initialisation, forward propagation and back propagation. The detailed steps of the algorithm are described as follows:

Step 1: Enter the learning samples (X_i, Y_i) (i = 1, 2, ..., n), where X_i and Y_i stand for the input and output vectors of the learning samples, respectively.

Step 2: Confirm the number of neurons in each layer and establish the connection weight matrices between adjacent layers, $M^0 = [m_{ij}^N]$, where $M^0$ collects the weights from layer 1 through layer (L + 1) and $m_{ij}^N$ is the weight attached to node j at layer N.

Step 3: Calculate the output of each node: $\hat{Y} = f\left(\sum_{i=1}^{L+1} m_{ij}^N I^N + T_i\right)$, where $I^N$ and $T_i$ are the input and the threshold value at each layer, respectively.

Step 4: Compute the mean square error over the input nodes:
$$\mathrm{MSE} = \frac{1}{n}\sum_{i=1}^{n}\left(Y^N - \hat{Y}^0\right)^2$$

Step 5: Stop calculating when the MSE meets the forecast error ɛ; otherwise, go to step 6.

Step 6: Adjust the connection weight matrices from layer 1 to layer (L + 1):
$$\delta^0 = -(Y^N - \hat{Y})\, f'(I^N), \quad \Delta M^0 = \eta\, \delta^0 I^N, \quad M^0 = M^0 + \Delta M^0$$

Step 7: Return to step 4; stop calculating when the MSE meets the desired value.
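Steps 1-7 above can be sketched with NumPy for a single hidden layer; the toy data, layer sizes, iteration count and learning rate below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

# Step 1: toy learning samples, 4 samples with 3 inputs and 1 output
X = rng.random((4, 3))
Y = rng.random((4, 1))

# Step 2: initialise the connection weight matrices and thresholds
W1, b1 = rng.normal(0, 0.5, (3, 5)), np.zeros(5)   # input -> hidden
W2, b2 = rng.normal(0, 0.5, (5, 1)), np.zeros(1)   # hidden -> output
eta = 0.5                                          # learning rate

def forward(X):
    H = sigmoid(X @ W1 + b1)           # Step 3: hidden layer output
    return H, sigmoid(H @ W2 + b2)     # ... and network output Y_hat

H, Y_hat = forward(X)
mse0 = np.mean((Y - Y_hat) ** 2)       # Step 4: mean square error

for _ in range(500):                   # Steps 5-7: iterate until MSE is small
    H, Y_hat = forward(X)
    # Step 6: back-propagate the error signal and adjust the weights
    d2 = (Y_hat - Y) * Y_hat * (1 - Y_hat)   # delta at the output layer
    d1 = (d2 @ W2.T) * H * (1 - H)           # delta at the hidden layer
    W2 -= eta * H.T @ d2;  b2 -= eta * d2.sum(axis=0)
    W1 -= eta * X.T @ d1;  b1 -= eta * d1.sum(axis=0)

mse1 = np.mean((Y - forward(X)[1]) ** 2)     # error after training
```

After training, `mse1` is smaller than the initial error `mse0`, mirroring the stopping criterion of steps 5 and 7.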

Performance evaluation for college laboratories
Performance evaluation method

Generally speaking, the convergence rate of a neural network slows down if the number of input attributes is too large, and over-fitting becomes more likely. Therefore, the input attributes must be reduced before they are fed into the BP neural network. The C4.5 decision tree algorithm is used to choose the optimal attribute set with high weight on the basis of the information gain ratio. Those attributes are then taken as the input nodes when the neural network is constructed. When the network has been trained on as many training samples as possible, the prediction network is obtained. After being tested and adjusted, the prediction network can be used to predict the result of the performance evaluation.

The performance evaluation method is shown in Figure 1.

Fig. 1

Procedure of performance evaluation for laboratories with the use of the decision tree and BP neural network.

The performance evaluation algorithm is described as follows:

Step 1: Determine the evaluation indexes.

Step 2: Pre-process the evaluation indexes with fuzzy mathematics so that the values fall in the range [0, 1].

Step 3: Generate a decision tree using the C4.5 algorithm.

Step 4: Select the evaluation indexes with high weight using the decision tree.

Step 5: Determine the optimised BP neural network structure as described in Section 3.2.2, then train the network. The evaluation indexes obtained in Step 4 and the evaluation results from expert consultation are the input and output of the network, respectively.

Step 6: Test the network and check whether it achieves the expected accuracy. Stop training when the expected error or the maximum number of iterations reaches the desired value.

After that, the results of performance evaluation will be output if evaluation indexes are input into the BP neural network.

Evaluation system of college laboratories

A reasonable evaluation system is the prerequisite of a scientific evaluation method. After analysing the ISO 9000 family of standards and combining it with many years' experience in performance evaluation, we established an evaluation system consisting of six primary indexes and 24 secondary indexes. The evaluation system is described in Table 1.

Evaluation system of college laboratories.

Primary indexes / Secondary indexes
Construction

1. Area and the environment

2. Instruments and equipment

3. Operation and maintenance

4. System and management

Laboratory team building

5. Tutors of experiment

6. Laboratory team construction

7. Personnel structure

8. Appraisal mechanism

9. Training mechanism

Experimental teaching

10. Practice ability

11. Exam of experiment

12. Report of experiment

13. Comprehensive and designed experiments

Administration system

14. System and management

15. Management tool

16. Experiment teaching material

17. Service efficiency

Laboratory safety

18. Safety measures

19. Hazmat management

20. Experimental environment protection

21. Clean and tidy

Innovation and entrepreneurship

22. Personnel structure proportions

23. Innovative entrepreneurship

24. Experiment project for college student

Pre-process the data

The data used came from our previous research, in which a great deal of performance evaluation data for college laboratories was accumulated. Each evaluation index is expressed as a score between 0 and 100. There are two kinds of evaluation indexes, namely qualitative and quantitative. Qualitative index scores can be given directly according to the scoring criteria; quantitative index scores are given by experts according to the specific conditions and performance of the laboratory. After the laboratories have been scored, the expert opinions on each index are aggregated by the arithmetic mean:
$$q_n = \frac{1}{T}\sum_{s=1}^{T} S_{ns}$$
where q_n is the aggregated score of index n, S_{ns} is the score of index n given by expert s, s is the expert serial number and T is the number of questionnaires.
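The arithmetic-mean aggregation can be written directly; the example scores below are hypothetical:

```python
def index_score(expert_scores):
    """q_n: arithmetic mean of the T expert scores S_ns for one index n."""
    return sum(expert_scores) / len(expert_scores)

# Three hypothetical questionnaires (T = 3) scoring the same index
q = index_score([80, 84, 76])
```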

Before being processed by the decision tree, the scores must be pre-processed so that they range from 0 to 1. The scores of the indexes are shown in Table 2; laboratory numbers are in the first column and the index scores in the remaining columns.

The scores of the indexes.

Laboratory number Index 1 Index 2 Index 3 Index 4 Index 5

1 72 84 76 72 75.6
2 77 78 85 85 81.2
3 86 94 75 86 84.3
4 87 71 82 82 81.4
5 82 63 88 76.5 80.2
6 71 83 89 80 80.6
7 85 94 92 88.5 89.6
8 81 60 68 74.5 71.6
9 79 55 74 66.5 70.2
10 91 76 90 89 87.3

For the sake of computational simplicity, fuzzy mathematics was used to pre-process the data. In this way, the evaluation scores can be converted to a grade system. For example, scores >85 are defined as 'excellent (A)', scores between 75 and 84 as 'good (B)', scores between 65 and 74 as 'general (C)' and scores <65 as 'poor (D)'. As a result, the scores in Table 2 can be converted to the grade system shown in Table 3.

Grade system of evaluation indexes.

Laboratory number Index 1 Index 2 Index 3 Index 4 Index 5

1 C B B C B
2 B B A A B
3 A A B A B
4 A C B B B
5 B D A B B
6 C B A B B
7 A A A A A
8 B D C B C
9 B D C C C
10 A B A A A

However, the grades in Table 3 cannot reflect the continuity of the underlying scores. For example, a score of 74 is converted to 'C' because it is <75, while a score of 75 is converted to 'B'; a one-point difference thus leads to different grades, which is misleading. Therefore, we must fuzzify the original data. Considering the numerical characteristics, we use trapezoidal membership functions to calculate the membership degrees.

The trapezoidal fuzzy function is a fuzzy distribution function, drawing on probability theory, that makes the evaluation more scientific [41]. Assume 'Excellent (A)', 'Good (B)', 'General (C)' and 'Poor (D)' form a set of fuzzy terms, each described by a trapezoidal membership function. The membership functions used in this paper are:
$$A(x) = \begin{cases} 0, & 0 \le x \le 81 \\ (x-81)/8, & 81 < x \le 89 \\ 1, & 89 < x \le 100 \end{cases} \qquad
B(x) = \begin{cases} 0, & 0 \le x \le 71 \\ (x-71)/8, & 71 < x \le 79 \\ 1, & 79 < x \le 81 \\ 1-(x-81)/8, & 81 < x \le 89 \\ 0, & 89 < x \le 100 \end{cases}$$
$$C(x) = \begin{cases} 0, & 0 \le x \le 61 \\ (x-61)/8, & 61 < x \le 69 \\ 1, & 69 < x \le 71 \\ 1-(x-71)/8, & 71 < x \le 79 \\ 0, & 79 < x \le 100 \end{cases} \qquad
D(x) = \begin{cases} 1, & 0 \le x \le 61 \\ 1-(x-61)/8, & 61 < x \le 69 \\ 0, & 69 < x \le 100 \end{cases}$$
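The four membership functions can be implemented directly. As a check, applying the formulas to a score of 84 gives membership 0.375 in A and 0.625 in B; the helper and function names below are ours:

```python
def trapezoid(x, a, b, c, d):
    """Membership that rises on (a, b], stays 1 on (b, c] and falls on (c, d]."""
    if x <= a or x > d:
        return 0.0
    if b < x <= c:
        return 1.0
    return (x - a) / (b - a) if x <= b else 1 - (x - c) / (d - c)

# A has no falling edge and D no rising edge, so they are one-sided ramps
def A(x): return min(1.0, max(0.0, (x - 81) / 8))
def D(x): return min(1.0, max(0.0, 1 - (x - 61) / 8))
def B(x): return trapezoid(x, 71, 79, 81, 89)
def C(x): return trapezoid(x, 61, 69, 71, 79)
```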

The original scores in Table 2 were processed according to the trapezoidal membership functions; the results are shown in Table 4. Due to space constraints, only four of the indexes are listed.

The data after fuzzy processed.

Index 1 Index 2 Index 3 Index 4
A B C D A B C D A B C D A B C D
0 0 1 0 0.375 0.625 0 0 0 0.625 0.375 0 0 0 1 0
0 0.875 0.125 0 0 0.875 0.125 0 0.625 0.375 0 0 0.625 0.375 0 0
0.75 0.25 0 0 1 0 0 0 0 0.625 0.375 0 0.75 0.25 0 0
0.625 0.375 0 0 0 0 1 0 0.125 0.875 0 0 0.125 0.875 0 0
0 0 0.875 0.125 0 0 0.75 0.25 0.75 0.25 0 0 0 0.625 0.375 0
0 0 1 0 0.75 0.25 0 0 1 0 0 0 0.375 0.625 0 0
Selecting the index with high weight

The decision tree was constructed using the algorithm described above. According to the performance evaluation system, the data samples comprise the 24 secondary indexes after pre-processing. The information gain ratio of every evaluation index was calculated by the C4.5 algorithm, and 11 indexes with high weight were selected according to the results. The information gain ratios are shown in Table 5.

Laboratory evaluation index system.

Number Indexes Information gain ratio
1 Area and environment 33.51%
2 Instruments and equipment 28.46%
3 Operation and maintenance 27.63%
4 System and management 25.32%
5 Practice ability 21.25%
6 Service efficiency 20.87%
7 Personnel structure 19.32%
8 Comprehensive and designed experiments 19.87%
9 Experiment project for college student 18.56%
10 Hazmat management 17.62%
11 Innovative entrepreneurship 16.93%
Construct the optimised BP neural network

The indexes selected by the decision tree serve as the input of the neural network, and the corresponding evaluation results from experts serve as the output. The BP neural network was implemented in Python with TensorFlow.

The sigmoid function was chosen as the network activation function in this paper:
$$f(x) = \frac{1}{1+e^{-x}}, \quad f(x) \in (0,1)$$

The delta learning rule was adopted to speed up convergence, reduce the error rate and avoid local minima. The weight update driven by the error signal is calculated as follows:
$$\Delta w_{ij}(m) = \eta\,(y_i(m) - P_j(m))\,Q_i(m)$$

In formula (12), Δw_ij(m) represents the change of the connection weight between neuron i and neuron j when the input vector is x_m; η and y_i are the learning rate and the expected output of neuron i, respectively; Q_i and P_j represent the activation values of neuron i and neuron j. Experiments show that the error falls to 0.0001 with η = 0.5 after the neural network has been trained 800 times.

Theoretically, a BP neural network with a single hidden layer can approximate any rational function. Increasing the number of hidden layers makes the network more complex, which increases the training time and reduces the training efficiency.

It has been proved theoretically by Hecht-Nielsen that any continuous function in a closed interval can be approximated by a BP network with a hidden layer. A three-layer BP network can perform any mapping from M dimension to N dimension [42].

Studies have shown that there is no mature theory to accurately determine the number of neurons in the hidden layer, so it is generally determined by empirical formulas. A common empirical formula is:
$$C = \sqrt{m+n} + a, \quad a \in [1,10]$$

where C is the number of neurons in the hidden layer, m is the number of neurons in the output layer and n the number in the input layer. Experiments show that the overall performance of the network is highest when m = 1, n = 11 and a = 4; that is, the hidden layer has 7 neurons. The structure of the BP neural network is shown in Figure 2.
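The empirical formula can be evaluated in one line; since the formula gives a non-integer value, we assume here that the square root is truncated before adding a, which reproduces the 7 hidden neurons reported above:

```python
import math

def hidden_neurons(m, n, a):
    """C = sqrt(m + n) + a, truncating the root to a whole neuron count."""
    return int(math.sqrt(m + n)) + a

# The paper's best setting: m = 1 output, n = 11 inputs, a = 4
c = hidden_neurons(1, 11, 4)
```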

Fig. 2

The structure of the BP neural network.

Verification and comparison
Testing the prediction accuracy

To test the prediction accuracy of the trained network, a series of experiments was carried out. The training sample set was first discretised based on information entropy, and a randomisation algorithm was used to shuffle the order of the samples. Then, 75% of the samples were selected as training samples and the remaining 25% as test samples.
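The shuffled 75/25 split can be sketched as follows; the seed and helper name are our own choices for reproducibility, not details from the paper:

```python
import random

def split_samples(samples, train_frac=0.75, seed=0):
    """Shuffle the sample set, then split it into training and test subsets."""
    shuffled = list(samples)
    random.Random(seed).shuffle(shuffled)   # randomise the sample order
    cut = int(len(shuffled) * train_frac)
    return shuffled[:cut], shuffled[cut:]

train, test = split_samples(range(100))     # 75 training, 25 test samples
```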

The comparison of real values and predicted values is shown in Table 6. The simulation results are very close to the real values, which shows that the method is feasible and effective in performance evaluation of college laboratories.

Comparison of real values and predicted values.

Laboratory No. 1 2 3 5 6 7
Real values 82 76 96 65 85 92
Predicted values 82.3 76.2 95.8 62.3 85.1 91.8
Comparison of three models

Three experiments were carried out, each using the same data samples to ensure fairness. The first adopted a performance evaluation model using a single decision tree, the second a model using a single BP neural network, and the third the method proposed in this paper. Each experiment was repeated 50 times and the average value was taken as the result. The comparison of experimental results is shown in Table 7. The average accuracy of the method combining the decision tree and the BP neural network is 86.28%. The data in the table indicate that the model introduced in this article is superior to the other two models, which verifies the accuracy and rationality of the evaluation model.

Comparison of experimental results.

Sequence Number of training samples Accuracy
Decision tree BP neural network Decision tree and BP neural network
1 300 76.4 73.8 81.6
2 400 78.2 72.5 82.5
3 500 81.6 77.6 84.3
4 600 82.1 79.3 86.1
5 700 81.6 80.4 88.5
6 800 82.1 81.5 86.2
7 900 83.4 82.9 89.4
8 1000 84.5 83.2 91.6
Average value 81.24 78.9 86.28

The neural network error curve comparison is shown in Figure 3. We can see from Figure 3 that the initial training error of the method is less than that of the other two methods, and it reaches the minimum error first. It shows that the classification accuracy of the model is improved.

Fig. 3

Neural network error curve comparison.

Conclusion

Finding a scientific qualitative and quantitative evaluation method is a challenging task in college laboratory performance evaluation. Although the decision tree is fast and accurate in classification prediction, and the BP neural network has strong nonlinear mapping and self-learning ability, both have shortcomings when used alone to construct evaluation models. For this reason, an evaluation model which combines the decision tree with the BP neural network is proposed in this article. The model overcomes the shortcomings of each separate model, reduces the disturbance of human factors and improves the accuracy of the evaluation.

The research in this paper demonstrates some obvious advantages, but it can still be improved in the following aspects:

There is still room for improving the decision tree and BP neural network algorithms adopted in this paper, such as improving the split criterion of the decision tree and the convergence speed of the BP neural network.

Some evaluation indexes are difficult to quantify during the evaluation process, such as laboratory rules and regulations. These evaluation indexes were not selected, which may affect the results of the evaluation to some extent. Therefore, the evaluation index system needs to be optimised in future work.

Fig. 1

Procedure of performance evaluation for laboratories with the use of the decision tree and BP neural network.

Fig. 2

The structure of the BP neural network.


Evaluation system of college laboratories.

Primary indexes Secondary indexes
Construction

1. Area and environment

2. Instruments and equipment

3. Operation and maintenance

4. System and management

Laboratory team building

5. Tutors of experiment

6. Laboratory team construction

7. Personnel structure

8. Appraisal mechanism

9. Training mechanism

Experimental teaching

10. Practice ability

11. Exam of experiment

12. Report of experiment

13. Comprehensive and designed experiments

Administration system

14. System and management

15. Management tool

16. Experiment teaching material

17. Service efficiency

Laboratory safety

18. Safety measures

19. Hazmat management

20. Experimental environment protection

21. Clean and tidy

Innovation and entrepreneurship

22. Personnel structure proportions

23. Innovative entrepreneurship

24. Experiment project for college student

Grade system of evaluation indexes.

Laboratory number Index 1 Index 2 Index 3 Index 4 Index 5

1 C B B C B
2 B B A A B
3 A A B A B
4 A C B B B
5 B D A B B
6 C B A B B
7 A A A A A
8 B D C B C
9 B D C C C
10 A B A A A

The scores of the indexes.

Laboratory number Index 1 Index 2 Index 3 Index 4 Index 5

1 72 84 76 72 75.6
2 77 78 85 85 81.2
3 86 94 75 86 84.3
4 87 71 82 82 81.4
5 82 63 88 76.5 80.2
6 71 83 89 80 80.6
7 85 94 92 88.5 89.6
8 81 60 68 74.5 71.6
9 79 55 74 66.5 70.2
10 91 76 90 89 87.3
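The grade table above appears to discretise these scores into the bands A/B/C/D. The cut-offs used below (85/75/65) are inferred for illustration only and are not stated in the paper; they reproduce, for example, laboratory 1's row, although at least one table entry (74.5 graded B) deviates slightly from them.

```python
def to_grade(score, cuts=(85, 75, 65)):
    """Map a 0-100 score to a grade A/B/C/D; the cut-offs are assumed, not from the paper."""
    for grade, cut in zip("ABC", cuts):
        if score >= cut:
            return grade
    return "D"

# Laboratory 1's scores from the table above:
print([to_grade(s) for s in [72, 84, 76, 72, 75.6]])  # → ['C', 'B', 'B', 'C', 'B']
```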

Laboratory evaluation index system.

Number Indexes Information gain ratio
1 Area and environment 33.51%
2 Instruments and equipment 28.46%
3 Operation and maintenance 27.63%
4 System and management 25.32%
5 Practice ability 21.25%
6 Service efficiency 20.87%
7 Personnel structure 19.32%
8 Comprehensive and designed experiments 19.87%
9 Experiment project for college student 18.56%
10 Hazmat management 17.62%
11 Innovative entrepreneurship 16.93%
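The information gain ratio that C4.5 uses to rank these indexes can be computed as follows. This is a generic sketch on toy data, not the paper's dataset: `grades` stands for one index's A-D values across laboratories, and `rating` is a hypothetical overall laboratory classification.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def gain_ratio(feature_values, labels):
    """C4.5 gain ratio: information gain divided by split information."""
    total = len(labels)
    base = entropy(labels)
    # Group the class labels by the feature's value (A/B/C/D grades here).
    groups = {}
    for v, y in zip(feature_values, labels):
        groups.setdefault(v, []).append(y)
    cond = sum(len(g) / total * entropy(g) for g in groups.values())
    split_info = entropy(feature_values)
    return (base - cond) / split_info if split_info else 0.0

# Toy data: one index's grades vs. a hypothetical overall rating.
grades = ["C", "B", "A", "A", "B", "C", "A", "B", "B", "A"]
rating = ["low", "mid", "high", "high", "mid", "low", "high", "low", "low", "high"]
print(round(gain_ratio(grades, rating), 3))  # → 0.737
```

Computing this ratio for every candidate index and keeping the highest-scoring ones is what produces the ranked list in the table above.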

Comparison of real values and predicted values.

Laboratory No. 1 2 3 5 6 7
Real values 82 76 96 65 85 92
Predicted values 82.3 76.2 95.8 62.3 85.1 91.8

The data after fuzzy processing.

Index 1 Index 2 Index 3 Index 4
A B C D A B C D A B C D A B C D
0 0 1 0 0.375 0.625 0 0 0 0.625 0.375 0 0 0 1 0
0 0.875 0.125 0 0 0.875 0.125 0 0.625 0.375 0 0 0.625 0.375 0 0
0.75 0.25 0 0 1 0 0 0 0 0.625 0.375 0 0.75 0.25 0 0
0.625 0.375 0 0 0 0 1 0 0.125 0.875 0 0 0.125 0.875 0 0
0 0 0.875 0.125 0 0 0.75 0.25 0.75 0.25 0 0 0 0.625 0.375 0
0 0 1 0 0.75 0.25 0 0 1 0 0 0 0.375 0.625 0 0
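Each row above records a laboratory's degrees of membership in the grades A-D, produced by the trapezoidal membership function used for pre-processing. A minimal sketch follows; the grade bands are hypothetical, chosen only so that a score's memberships sum to 1 as in the table, since the paper does not state its breakpoints.

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership: rises on [a, b], flat on [b, c], falls on [c, d]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

# Hypothetical overlapping grade bands (not the paper's breakpoints):
bands = {"A": (82, 90, 100, 101), "B": (72, 80, 82, 90),
         "C": (62, 70, 72, 80), "D": (0, 0, 62, 70)}

score = 84
print({g: round(trapezoid(score, *p), 3) for g, p in bands.items()})
# → {'A': 0.25, 'B': 0.75, 'C': 0.0, 'D': 0.0}
```

Because adjacent bands overlap linearly, a score near a grade boundary is split between two grades (here 0.25 A and 0.75 B), which is exactly the pattern of the fractional entries in the table.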


