Ahead of Print
Journal Details
License
Format
Journal
First Published
01 Jan 2016
Publication timeframe
2 times per year
Languages
English
access type Open Access

The Comprehensive Diagnostic Method Combining Rough Sets and Evidence Theory

Published Online: 25 May 2021
Page range: -
Received: 14 Jan 2021
Accepted: 26 Feb 2021
Abstract

To overcome the practical difficulties caused by subjectivity, relativity and focal-element explosion during evidence combination when evidence theory is applied to uncertain fault-diagnosis problems, this paper proposes a fault diagnosis inference strategy that integrates rough sets with evidence theory, drawing on the theories of information fusion and meta-synthesis. Rough sets are used to remove redundancy from the characteristic data and to extract the uncorrelated essential characteristics; an objective way of assigning basic probability is proposed; and an evidence synthesis method is put forward to handle highly conflicting evidence. The proposed method improves the accuracy of fault diagnosis by exploiting the redundant and complementary information of the various faults, synthesising all evidences with the combination rule of evidence theory. Experiments demonstrate the feasibility and validity of the method and its efficiency in improving fault diagnosis.

Keywords

Introduction

Information fusion is widely applied in fault diagnosis, and many approaches exist. Evidence theory, also known as Dempster–Shafer theory (DST) [1, 2], can combine uncertain information from several information sources and is a very effective uncertainty-reasoning tool in information fusion. However, evidence theory itself has several problems [3]: dependency on evidence provided by expert knowledge, focal-element explosion caused by evidence combination, strict combination conditions (mutual independence among evidences), poor handling of evidential conflicts, and subjectivity in the assignment of credibility. Many scholars have improved the theory to overcome these limitations, but deficiencies remain [4,5,6]. Rough set theory is a mathematical tool for dealing with vague and uncertain knowledge and is very practical: it can extract uncorrelated essential characteristics and eliminate redundancy without any prior or additional data [7, 8]. Both rough set theory and evidence theory are mathematical tools for handling uncertain information, and there is a strong complementary relationship between them [9,10,11]. This paper offers a new line of research for intelligent fault diagnosis by combining evidence theory with rough set theory to address the subjectivity, relativity, focal-element explosion and conflict problems of evidence, improve the accuracy of fault diagnosis and, ultimately, enable online diagnosis.

Basic Theories
Rough Set Theory

This paper gives only two definitions concerning attribute significance; the basic ideas of rough set theory can be found in [12].

Definition 1

The significance of any attribute $a \in (C - R)$ with respect to $D$ is defined as $SGF(a,R,D) = \gamma(R\cup\{a\},D) - \gamma(R,D)$.
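As an illustration, the dependency degree γ and the significance SGF of Definition 1 can be computed directly from a decision table. The sketch below (rows as dicts; the attribute names are hypothetical, not from the paper) follows the standard positive-region definition γ(R, D) = |POS_R(D)|/|U|:

```python
def partition(rows, attrs):
    """Equivalence classes of row indices under the values of attrs."""
    classes = {}
    for i, row in enumerate(rows):
        classes.setdefault(tuple(row[a] for a in attrs), []).append(i)
    return classes.values()

def gamma(rows, attrs, dec):
    """Dependency degree gamma(R, D) = |POS_R(D)| / |U|: the fraction of
    objects whose R-equivalence class has a single decision value."""
    pos = sum(len(c) for c in partition(rows, attrs)
              if len({rows[i][dec] for i in c}) == 1)
    return pos / len(rows)

def sgf(rows, a, R, dec):
    """Definition 1: SGF(a, R, D) = gamma(R + [a], D) - gamma(R, D)."""
    return gamma(rows, R + [a], dec) - gamma(rows, R, dec)

# tiny hypothetical table: c1 alone resolves only the third object
rows = [{"c1": 1, "d": 0}, {"c1": 1, "d": 1}, {"c1": 2, "d": 1}]
print(sgf(rows, "c1", [], "d"))   # 1/3: only the class {c1=2} is in the positive region
```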

Definition 2

For $B \subseteq C$: if $B$ is independent with respect to $D$ and $\gamma(C,D) = \gamma(B,D)$, then $B$ is a $D$-relative reduction of $C$.

D-S Evidence Theory

Here only the D-S synthesis formula is given; the related background can be found in [1, 2].

Definition 3

If $Bel_1, Bel_2, \ldots, Bel_N$ are belief functions on the same frame of discernment $\Theta$ and $m_1, m_2, \ldots, m_N$ are their corresponding basic probability assignments, then the combined basic probability assignment is:
$$m(A) = \begin{cases} 0 & A = \varphi \\[4pt] \sum\limits_{\cap A_{n}=A}\prod\limits_{n=1}^{N} m_{n}(A_{n}) \Big/ \sum\limits_{\cap A_{n} \neq \varphi}\prod\limits_{n=1}^{N} m_{n}(A_{n}) & \forall A \neq \varphi \end{cases} \tag{1}$$

Here $k = \sum\limits_{\cap A_{n}=\varphi}\prod\limits_{n=1}^{N} m_{n}(A_{n})$ reflects the degree of conflict among the evidences. Definition 3 is the combination rule of D-S theory (DST).
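For concreteness, Dempster's rule for two mass functions can be sketched as follows (focal elements represented as frozensets; the identifiers are illustrative, not from the paper):

```python
from itertools import product

def dempster(m1, m2):
    """Combine two basic probability assignments with the D-S rule.
    m1, m2: dicts mapping frozenset focal elements to masses."""
    raw, k = {}, 0.0
    for (A, a), (B, b) in product(m1.items(), m2.items()):
        inter = A & B
        if inter:
            raw[inter] = raw.get(inter, 0.0) + a * b
        else:
            k += a * b                      # mass falling on the empty set
    if k == 1.0:
        raise ValueError("total conflict (k = 1): fusion cannot be done")
    # normalise by 1 - k, as in Eq. (1)
    return {A: v / (1 - k) for A, v in raw.items()}, k

A, B = frozenset({"A"}), frozenset({"B"})
m, k = dempster({A: 0.6, B: 0.4}, {A: 0.7, B: 0.3})
# k = 0.46; m[A] = 0.42/0.54, m[B] = 0.12/0.54
```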

Improvement of D-S Evidence Synthesis Rule

From the D-S synthesis formula: when k = 1, the evidences conflict completely and fusion cannot be performed; when 0 < k < 1, the synthesis formula can be applied, but as k → 1 and the evidences conflict highly, the formula leads to counterintuitive conclusions. Many scholars have made significant contributions on distributing the conflicting probability mass. Sun Quan, after analysing the advantages and disadvantages of the DST and Yager synthesis rules, proposed a new synthesis formula that handles conflicts of different degrees in a better way. The formula is as follows [13]:
$$m(\varphi) = 0; \quad m(A) = \sum\limits_{\cap A_{n}=A}\prod\limits_{n=1}^{N} m_{n}(A_{n}) + f(A) \quad \forall A \neq \varphi; \quad m(\Theta) = \sum\limits_{\cap A_{n}=\Theta}\prod\limits_{n=1}^{N} m_{n}(A_{n}) + f(\Theta) \tag{2}$$
$$f(A) = k\varepsilon q(A), \qquad q(A) = \frac{1}{n}\sum\limits_{1 \leq i \leq n} m_{i}(A) \tag{3}$$
$$f(\Theta) = k\varepsilon q(\Theta) + k(1 - \varepsilon) \tag{4}$$

Here n is the number of evidences, and the reliability of the evidence $\varepsilon$ is
$$\varepsilon = e^{-\tilde{k}}, \qquad \tilde{k} = \frac{1}{n(n-1)/2}\sum\limits_{i<j} k_{ij}, \quad i, j \leq n, \qquad k_{ij} = \sum\limits_{A_{i} \cap A_{j}=\varphi} m_{i}(A_{i})\, m_{j}(A_{j})$$

Eq. (4) contains some subjective factors, while the physical meaning of Eqs (2) and (3) is not clear. Since q(A) is the average degree of support of the evidence for A, the conflicting probability mass k can be assigned to A in proportion to this support; that is, the conflicting mass is weighted and assigned according to the degree of support for each proposition. Moreover, because the evidence sources differ in importance, an evidence weight factor is introduced. This paper proposes a new evidence synthesis method:
$$m(\varphi) = 0; \quad m(A) = \sum\limits_{\cap A_{n}=A}\prod\limits_{n=1}^{N} m_{n}(A_{n}) + k\, q'(A) \quad \forall A \neq \varphi; \qquad q'(A) = \sum\limits_{1 \leq i \leq n} w_{i}\, m_{i}(A)$$

Here $k = \sum\limits_{\cap A_{n}=\varphi}\prod\limits_{n=1}^{N} m_{n}(A_{n})$, and $w_i$ is the weight of the i-th evidence source; this paper uses the self-conflict of the evidence to determine the evidence weight coefficients.

Definition 4

Suppose the sum of the conflicts between evidence i and each other evidence j (j = 1, 2, ..., i−1, i+1, ..., n) is the amount of self-conflict $\phi_i$ of evidence i:
$$\phi_{i} = \sum\limits_{j=1, j \neq i}^{n}\ \sum\limits_{A_{i} \cap A_{j}=\varphi} m_{i}(A_{i})\, m_{j}(A_{j})$$
(n is the number of evidences). Then
$$w_{i} = \frac{1/\phi_{i}}{\sum\limits_{j=1}^{n} 1/\phi_{j}}, \qquad i = 1, 2, \ldots, n.$$

Definition 4 shows that the greater the self-conflict of an evidence, the smaller its weight factor, and conversely, the smaller the self-conflict, the greater its weight factor. The weight factor reflects the importance of the evidence provided by each information source in the synthesis process and its influence on the synthetic result.

Here is an example to demonstrate the validity of the method. Suppose Θ = {A, B, C} and three evidences are as follows:

m1 : m1(A) = 0.98, m1(B) = 0.01, m1(C) = 0.01;

m2 : m2(A) = 0, m2(B) = 0.01, m2(C) = 0.99;

m3 : m3(A) = 0.9, m3(B) = 0, m3(C) = 0.1.

The three evidences are synthesised with the methods of DST, Yager [14], Sun Quan [13], Li Bicheng [15] and the method proposed in this paper, respectively (evidence weights w1 = 0.37414, w2 = 0.21902, w3 = 0.40684); the results are shown in Table 1. Table 1 shows that the evidence synthesis method with the weight factor proposed in this paper enhances the reasonability and reliability of evidence combination. By determining the evidence weight coefficients from the amount of evidence self-conflict, it avoids the subjectivity and randomness of hand-picked weight factors, reflects the importance of each evidence during synthesis and obtains a better result than the other methods.
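Under the assumption (true in this example) that all focal elements are singletons, the weights of Definition 4 and the proposed synthesis rule can be reproduced numerically; this sketch regenerates the weights and the "This paper" row of Table 1:

```python
from itertools import product

THETA = ("A", "B", "C")
evid = [{"A": 0.98, "B": 0.01, "C": 0.01},   # m1
        {"A": 0.00, "B": 0.01, "C": 0.99},   # m2
        {"A": 0.90, "B": 0.00, "C": 0.10}]   # m3
n = len(evid)

def conflict(mi, mj):
    # pairwise conflict: mass on pairs of disagreeing singletons
    return sum(a * b for (X, a), (Y, b) in product(mi.items(), mj.items()) if X != Y)

# Definition 4: self-conflict phi_i, then w_i = (1/phi_i) / sum_j (1/phi_j)
phi = [sum(conflict(evid[i], evid[j]) for j in range(n) if j != i) for i in range(n)]
w = [(1 / p) / sum(1 / q for q in phi) for p in phi]

# proposed rule: m(A) = agreeing product term + k * q'(A), q'(A) = sum_i w_i m_i(A)
agree = {X: 1.0 for X in THETA}
for m_i in evid:
    for X in THETA:
        agree[X] *= m_i[X]
k = 1.0 - sum(agree.values())                     # total conflict, 0.99901 here
q = {X: sum(wi * m_i[X] for wi, m_i in zip(w, evid)) for X in THETA}
m = {X: agree[X] + k * q[X] for X in THETA}
# w ≈ (0.37414, 0.21902, 0.40684)
# m ≈ {A: 0.7321, B: 0.0059, C: 0.2620}, matching the last row of Table 1
```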

Comparison of results of various evidence synthesis methods.

Synthesis formula    k        m(A)     m(B)     m(C)     m(Θ)
DST                  0.99901  0        0        1        0
Yager                0.99901  0        0        0.00099  0.99901
Sun Quan             0.99901  0.3210   0.0030   0.1880   0.4880
Li Bicheng           0.99901  0.6260   0.0067   0.3673   0
This paper           0.99901  0.7321   0.0059   0.2620   0

Applying the attribute reduction of rough sets can effectively select the key features used as evidence for diagnosis decisions, and the attribute significance of rough sets can evaluate the weight of each evidence objectively, which gives the formula $w_{i} = SGF(i,R,D) \big/ \sum\limits_{j \in C} SGF(j,R,D)$.

Inference method of rough set and evidence theories

This paper builds a fault diagnosis model based on an inference strategy that fuses rough set and evidence theories, as shown in Figure 1. The basic idea is as follows. First, an information decision table is built by discretising the fault sample data sets with continuous attributes. Then, the feature parameters suitable for fault diagnosis are selected by applying rough set attribute reduction to the decision table. In operation, the sample to be diagnosed is discretised, the basic probability assignments of the related evidences are calculated on the basis of the reduction, and the diagnosis result is obtained by inductive decision using the D-S combination rule.

Fig. 1

Structure of diagnostic model integrated with evidence theory and rough set.

Discretisation of continuous data sample

Rough set theory can only handle discretised data, while original sample data are usually continuous, so the continuous data must be discretised first. There are many discretisation methods, each with its own advantages; in practice, each field chooses a suitable algorithm according to its characteristics [16, 17].
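As a placeholder for the SOFM-based discretisation used later in the paper, any simple binning scheme illustrates this step. The sketch below uses equal-width bins, an assumption for illustration only, not the paper's algorithm:

```python
def equal_width_bins(values, n_bins=3):
    """Equal-width binning: a simple stand-in for the SOFM-based
    discretisation actually used in the paper (see [16, 17, 21])."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins or 1.0      # guard against a constant column
    # map each value to a bin label 1..n_bins
    return [min(int((v - lo) / width) + 1, n_bins) for v in values]

print(equal_width_bins([0.0, 0.5, 1.0], n_bins=2))   # [1, 2, 2]
```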

Symptom attribute reduction

Symptom attribute reduction reduces the relativity among evidences and determines the fault with as few attributes as possible while keeping the classification quality invariant, thus avoiding focal-element explosion. In addition, the weight of each attribute can be obtained from the information in the decision table, so the subjectivity of expert judgement is avoided, efficiently overcoming the practical difficulties caused by the subjectivity and relativity of evidence in fault diagnosis with evidence theory. There is still no generally accepted, highly efficient reduction algorithm in rough set theory. Taking the completeness of the reduction algorithm into account, this paper performs attribute reduction by combining the discernibility matrix, the dependability of attributes and a heuristic reduction algorithm based on information entropy with an improved attribute significance. The detailed procedure of this algorithm can be found in [18, 19].
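The paper's exact reduction algorithm is in [18, 19]; as a simplified stand-in, a greedy search guided by the dependency degree already recovers the reduct used later ({C1, C2, C5} for the discretised decision table of Table 3):

```python
def classes(rows, attrs):
    out = {}
    for i, r in enumerate(rows):
        out.setdefault(tuple(r[a] for a in attrs), []).append(i)
    return out.values()

def gamma(rows, attrs, dec):
    """Dependency degree: fraction of rows in the positive region."""
    pos = sum(len(c) for c in classes(rows, attrs)
              if len({rows[i][dec] for i in c}) == 1)
    return pos / len(rows)

def greedy_reduct(rows, cond, dec):
    """Greedily add the attribute with the largest dependency gain until the
    dependency of the full condition set is reached. A heuristic sketch only;
    it does not guarantee a minimal reduct."""
    target, red = gamma(rows, cond, dec), []
    while gamma(rows, red, dec) < target:
        red.append(max((a for a in cond if a not in red),
                       key=lambda a: gamma(rows, red + [a], dec)))
    return red

# discretised original samples of Table 3 (C1..C5, decision D)
table3 = [
    {"C1": 3, "C2": 1, "C3": 3, "C4": 3, "C5": 2, "D": "D1"},
    {"C1": 2, "C2": 2, "C3": 2, "C4": 2, "C5": 3, "D": "D1"},
    {"C1": 1, "C2": 2, "C3": 2, "C4": 2, "C5": 3, "D": "D2"},
    {"C1": 2, "C2": 3, "C3": 1, "C4": 1, "C5": 2, "D": "D3"},
    {"C1": 2, "C2": 1, "C3": 3, "C4": 3, "C5": 2, "D": "D4"},
    {"C1": 1, "C2": 1, "C3": 2, "C4": 3, "C5": 2, "D": "D4"},
    {"C1": 1, "C2": 2, "C3": 2, "C4": 2, "C5": 1, "D": "D5"},
]
print(greedy_reduct(table3, ["C1", "C2", "C3", "C4", "C5"], "D"))   # ['C1', 'C2', 'C5']
```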

Basic probability assignment of evidence

Basic probability assignment is realised with the decision attribute D of the decision table as the frame of discernment Θ of evidence theory and each condition attribute r (r ∈ R) remaining after reduction as the corresponding evidence. Many scholars have researched the relation between the reliability calculations of evidence theory and rough set theory [10, 11]. Since rough set theory requires no prior or additional data, basic probability assignment based on rough set theory is more objective.

Theorem 1

If $X \subseteq U$ and the equivalence relation gives $U/R = \{X_1, X_2, \ldots, X_n\}$, then the measure of the lower approximation of $X$, $q(X) = |\underline{apr}(X)|/|U|$, $X \subseteq U$ ($\underline{apr}(\cdot)$ is the lower approximation), is a belief function, and the basic probability assignment is: $m(X_i) = |X_i|/|U|$, $i = 1, 2, \ldots, n$; $m(A) = 0$, $A \notin U/R$.

Theorem 2

If Bel is a belief function satisfying the following conditions: every focal element of Bel is an equivalence class of U, and the basic probability assignment of each focal element A of Bel is m(A) = |A|/|U|, then there exists a Pawlak rough algebra such that q(X) = Bel(X), X ⊆ U.

The proofs of Theorems 1 and 2 can be found in [10]. From these theorems, the belief function can be calculated from a decision table with rough set theory, which provides the theoretical basis for the fused reasoning of evidence theory and rough set theory.

The basic probability assignment $m(A) = \sum\limits_{B \subseteq A} (-1)^{\lvert A - B \rvert} Bel(B)$ can be calculated from the belief function $Bel(A) = \sum\limits_{B \subseteq A} m(B)$ $(\forall A \subseteq \Theta)$. From Theorem 1, the belief function is equivalent to a lower approximation. Considering the inconvenience of this expression in practice, this paper uses a simple algorithm based on the cardinality of sets:
$$m(A) = \begin{cases} \lvert A \rvert / \lvert U \rvert & (A \in U/P) \\ 0 & (A \notin U/P) \end{cases} \tag{5}$$

Here P is an indiscernibility relation. Clearly, Eq. (5) satisfies Theorems 1 and 2.

Next, a practical algorithm for basic probability assignment based on a rough set decision table is given. {Input: decision table after reduction; samples to be diagnosed}; {Output: basic probability assignments of all evidences}.

1. Quantify the samples to be diagnosed: discretise them according to the discretisation criterion of the original sample data;

2. The frame of discernment Θ is the decision attribute set, and the reasoning evidences are the condition attributes r (r ∈ R);

3. Determine the partition U/Θ of U induced by Θ;

4. Determine the partition U/R of U with each evidence r_i as the equivalence relation R;

5. According to the discrete value of the sample to be diagnosed on evidence r_i, determine its equivalence class in U/R and the intersection of this class with each class of U/Θ;

6. Obtain the basic probability assignment of evidence r_i from Eq. (5).
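Applied to the reduced decision table, these steps collapse to a few lines. The sketch below (attribute and decision names taken from Table 3) reproduces, before discounting, the E2 assignment for test sample 3 (C2 = 2), i.e. m(D1) = m(D2) = m(D5) = 1/3:

```python
# reduced decision table (original samples of Table 3): C1, C2, C5 and D
reduced = [
    {"C1": 3, "C2": 1, "C5": 2, "D": "D1"},
    {"C1": 2, "C2": 2, "C5": 3, "D": "D1"},
    {"C1": 1, "C2": 2, "C5": 3, "D": "D2"},
    {"C1": 2, "C2": 3, "C5": 2, "D": "D3"},
    {"C1": 2, "C2": 1, "C5": 2, "D": "D4"},
    {"C1": 1, "C2": 1, "C5": 2, "D": "D4"},
    {"C1": 1, "C2": 2, "C5": 1, "D": "D5"},
]

def bpa(rows, attr, value, dec):
    """Eq. (5): m(A) = |A| / |U|, where U is the equivalence class selected by
    the sample's discretised value on evidence `attr` and each A is its
    intersection with a decision class."""
    cls = [r for r in rows if r[attr] == value]
    m = {}
    for r in cls:
        m[r[dec]] = m.get(r[dec], 0.0) + 1.0 / len(cls)
    return m

print(bpa(reduced, "C2", 2, "D"))   # each of D1, D2, D5 gets mass 1/3
```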

The discount of evidence and reasoning decision

If there is only 1 − α (α ∈ [0,1]) confidence in the whole body of evidence, α is considered as the discount rate. With the discount applied, the basic probability assignment becomes:
$$m^{\alpha}(A) = (1-\alpha)\, m(A) \quad \forall A \subset \Theta,\ A \neq \emptyset; \qquad m^{\alpha}(\Theta) = (1-\alpha)\, m(\Theta) + \alpha$$

The diagnosis can then be concluded from the decision on the basic probability assignments after combination with the D-S rule [20].
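A quick sketch of the discount operation: applied to a uniform assignment of 1/3 over D1, D2, D5 with α = 0.05, it yields exactly the 0.3167 and 0.05 entries seen in the E2 row of Table 4 (the `Theta` label is a stand-in for Θ):

```python
def discount(m, alpha, theta="Theta"):
    """Discounting: m_a(A) = (1 - alpha) m(A) for A != Theta,
    m_a(Theta) = (1 - alpha) m(Theta) + alpha."""
    out = {A: (1 - alpha) * v for A, v in m.items() if A != theta}
    out[theta] = (1 - alpha) * m.get(theta, 0.0) + alpha
    return out

m = discount({"D1": 1/3, "D2": 1/3, "D5": 1/3}, alpha=0.05)
# each fault class gets 0.95/3 ≈ 0.3167 and Theta gets 0.05, as in Table 4
```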

Diagnosis examples

The engine is a key piece of equipment in a ship, bearing on the ship's survivability and combat effectiveness; its performance influences and constrains the ship's technical and tactical performance. The structure of the engine is complex, and its fault analysis is difficult. To ensure the accuracy of fault diagnosis, many characteristic parameters must be used, and the number of characteristics to be extracted grows with the variety of parameters; this makes the amount of information to be processed too large to meet the needs of online diagnosis. Only a few key characteristics are sensitive to a fault; they are independent and provide complementary information that improves the accuracy of diagnosis, while the redundant characteristics are insensitive to the fault or correlated with other characteristics and therefore useless. Rough set theory can eliminate redundant information effectively and select the key characteristics; probability assignments are then obtained and reasoning decisions made with the D-S combination rule, solving problems such as subjectivity in obtaining evidence, relativity of evidence and focal-element explosion in evidence combination.

The extraction of evidence information

After experiments on five fault phenomena appearing in a kind of ship engine and the organisation of the diagnostic knowledge, this paper selects 10 sets of data samples to build the fault information table, shown in Table 2. In Table 2, U stands for the n (n = 10) states of the engine; C = {cooling water temperature C1, airflow C2, fuel pressure C3, revolution speed C4, torque C5} are the five characteristic (symptom) parameters describing the state of the equipment; and D = {normal D1, high temperature D2, airflow meter damage D3, fuel injector fault D4, ignition fault D5} gives the fault symptom of each state. Three sets of data are selected randomly as test samples (to be diagnosed) to verify the diagnostic effect of the proposed method.

The attribute values of the sample data in Table 2 are continuous and need to be discretised. Here a discretisation of continuous attribute values based on SOFM network classification is used; the details and computing process of this algorithm can be found in [21]. The decision table after discretisation is shown in Table 3. After attribute reduction with the method proposed in this paper, only three condition attributes, C1, C2, C5, are kept in the decision table. They are the diagnostic evidences for the five fault symptoms: cooling water temperature (E1), airflow (E2), torque (E3).

Table of fault information.

U                  C1    C2    C3    C4    C5    D
Original samples
 1                 0.95  1.40  1.20  5100  0.39  D1
 2                 0.70  1.74  0.94  3450  0.98  D1
 4                 0.06  1.73  0.98  3900  0.91  D2
 5                 0.65  3.31  0.63  1950  0.55  D3
 7                 0.30  1.31  1.32  4350  0.40  D4
 8                 0.28  1.23  1.11  4300  0.44  D4
 10                0.05  2.47  1.01  3800  0.20  D5
Test samples
 3                 1.02  1.73  0.57  3500  0.98  D1
 6                 0.67  3.71  0.77  1900  0.45  D3
 9                 0.18  0.95  1.14  3600  0.30  D5

Decision table after discretization.

U                  C1  C2  C3  C4  C5  D
Original samples
 1                 3   1   3   3   2   D1
 2                 2   2   2   2   3   D1
 4                 1   2   2   2   3   D2
 5                 2   3   1   1   2   D3
 7                 2   1   3   3   2   D4
 8                 1   1   2   3   2   D4
 10                1   2   2   2   1   D5
Test samples
 3                 3   2   1   2   3   verify D1
 6                 2   3   1   1   2   verify D3
 9                 1   1   2   2   1   verify D5

From Definition 1, the significance of attributes C1, C2, C5 with respect to D can be obtained:
$$SGF(C_{1},R,D) = \gamma(R\cup\{C_{1}\},D)-\gamma(R,D) = 5/7, \qquad SGF(C_{2},R,D) = SGF(C_{5},R,D) = 2/7$$

This shows that C1 is the attribute that most influences the accuracy of the diagnostic decision rule, followed by C2 and C5, which agrees with engineering practice. The weight of each evidence $E_i$ (i = 1, 2, 3) is then obtained from
$$w_{i} = SGF(i,R,D) \big/ \sum\limits_{j \in C} SGF(j,R,D), \qquad (\omega_{1}, \omega_{2}, \omega_{3}) = (0.56, 0.22, 0.22)$$

Basic probability assignment and D-S evidence theory

For evidences E1, E2, E3, the basic probability assignments are calculated with α = 0.05 as the discount rate. In Table 3, the attribute values of test sample 6 and original sample 5 are identical after discretisation, so the diagnostic decision is necessarily the same and D3 need not be verified further. For the other two test samples, D1 and D5 are fused with D-S evidence theory and with the improved combining method proposed in this paper; the basic probability assignments are shown in Tables 4 and 5, where Θ denotes the overall uncertainty.

Basic probability assignment about D1.

Verify D1            K        m(D1)    m(D2)    m(D3)   m(D4)   m(D5)    m(Θ)
E1                   0        0.9500   0        0       0       0        0.05
E2                   0        0.3167   0.3167   0       0       0.3167   0.05
E3                   0        0.4750   0.4750   0       0       0        0.05
E1E2     DS                   0.9144   0.0397   0       0       0.0397   0.0062
         Improve DS  0.6017   0.8269   0.0703   0       0       0.0703   0.0325
E1E3     DS                   0.9522   0.0433   0       0       0        0.0045
         Improve DS  0.4512   0.8899   0.0850   0       0       0        0.0251
E2E3     DS                   0.4771   0.4771   0       0       0.0397   0.0062
         Improve DS  0.6017   0.4282   0.4282   0       0       0.1111   0.0325
E1E2E3   DS                   0.9488   0.0469   0       0       0.0039   0.0006
         Improve DS  0.7972   0.7534   0.1498   0       0       0.0569   0.0399

Basic probability assignment about D5.

Verify D5            K        m(D1)    m(D2)    m(D3)   m(D4)    m(D5)    m(Θ)
E1                   0        0        0.3167   0       0.3167   0.3167   0.05
E2                   0        0.3167   0        0       0.6334   0        0.05
E3                   0        0        0        0       0        0.9500   0.05
E1E2     DS                   0.0531   0.0531   0       0.8326   0.0531   0.0081
         Improve DS  0.7020   0.0794   0.1746   0       0.5339   0.1746   0.0375
E1E3     DS                   0        0.0398   0       0.0398   0.9145   0.0060
         Improve DS  0.6017   0        0.1520   0       0.1520   0.6636   0.0324
E2E3     DS                   0.1624   0        0       0.3248   0.4872   0.0256
         Improve DS  0.9025   0.1587   0        0       0.3175   0.4762   0.0476
E1E2E3   DS                   0.0245   0.0245   0       0.3840   0.5634   0.0036
         Improve DS  0.9677   0.0688   0.1707   0       0.3191   0.3932   0.0482
The analysis and explanation of diagnosis

From Tables 4 and 5, a case may be undiagnosable, or a diagnostic error may occur, if the diagnosis relies on a single evidence. For example, E2 and E3 in Table 4 cannot produce a diagnosis, and E2 in Table 5 gives a diagnostic error: the diagnostic capability of a single evidence is limited, so it is necessary to combine all the evidences to obtain a better diagnosis. After two evidences are combined, the diagnostic capability improves because the overall uncertainty decreases compared with single-evidence diagnosis. Nevertheless, undiagnosable cases or diagnostic errors remain possible owing to the limited capability of the individual evidences and of the original training samples; for example, E2E3 in Table 4 cannot produce a diagnosis, and E1E2 in Table 5 gives a diagnostic error. For some evidence pairs the diagnosis is still uncertain after fusion, but the uncertainty is reduced gradually; after all three evidences E1, E2, E3 are fused, the certainty strengthens. With one or two evidences, an undiagnosable case or a diagnostic error may occur, whereas with all three evidences fused, the correct diagnosis is finally obtained.

This shows that the diagnostic capability can be improved by fusion reasoning over all the non-redundant evidences, which reduces the uncertainty efficiently.

According to Tables 4 and 5, the fusion results of the proposed method are not as good as those of D-S evidence theory in this example, because the former is designed to handle the fusion of highly conflicting evidences (k → 1) and assumes an open frame of discernment, whereas the latter achieves a good synthesis for normal evidences and assumes that the current frame of discernment is complete and closed. The synthetic results nevertheless show that the method with the weight factor is also effective for synthesising normal evidences.

Conclusion

An integrated diagnostic method based on the fused reasoning of evidence theory and rough sets is put forward; it overcomes the subjectivity, dependency and focal-element explosion problems of traditional evidence combination and has significant theoretical and practical value.

This paper proposes a synthesis method that takes the weight factor into account. The method solves the synthesis of highly conflicting evidences and is also effective for synthesising normal evidences.

Diagnosis examples confirm the good performance of the integrated rough set and evidence theory diagnosis. As the amount of original sample data increases, the accuracy of diagnosis will improve, and the proposed method can be extended to the fault diagnosis of other devices provided the sample data are large enough.

This research will be deepened further; subsequent work will focus on developing an artificial intelligence diagnostic system for better application in industrial practice and maximum production benefit.



[1] Dempster A. P. Upper and Lower Probabilities Induced by a Multivalued Mapping, Annals of Mathematical Statistics, no. 38, pp. 325–339, 1967.

[2] Shafer G. A Mathematical Theory of Evidence. Princeton: Princeton University Press, pp. 133–185, 1976.

[3] Webster L., Chen J.-G., Simon S., et al. Validation of Authentic Reasoning Expert Systems, Information Sciences, no. 117, pp. 19–46, 1999.

[4] Zhu Lijun, Hu Zheng, Yang Yongmin. Fault Diagnosis Based on Reasoning Integration of Rough Sets and Evidence Theory, Transactions of CSICE, vol. 25, no. 1, pp. 90–95, 2007.

[5] Li Yanhong, Guo Haixia. Fault Diagnosis of Mine Belt Conveyor Based on Improved DS Evidence Theory, Coal Mine Machinery, vol. 41, no. 8, pp. 174–176, 2020.

[6] Zhang Jia Jin, Nuo Hen Yi, Lin Ke Ding. Risk Analysis of a Bayesian Network for Harmful Chemicals Road Transportation Systems Based on Fuzzy Sets and Improved Dempster/Shafer (DS) Evidence Theory, Journal of Beijing University of Chemical Technology (Natural Science), vol. 47, no. 1, pp. 38–45, 2020.

[7] Pawlak Z. Rough Sets, International Journal of Computer and Information Sciences, vol. 11, no. 5, pp. 341–356, 1982.

[8] Yunliang J., Congfu X., Jin G., et al. Research on Rough Set Theory Extension and Rough Reasoning, IEEE International Conference on Systems, Man and Cybernetics, The Hague, pp. 5888–5893, 2004.

[9] Yao Y. Y., Lingras P. J. Interpretations of Belief Functions in the Theory of Rough Sets, Information Sciences, no. 104, pp. 81–106, 1998.

[10] Skowron A., Grzymala-Busse J. From Rough Set Theory to Evidence Theory. In: Advances in the Dempster-Shafer Theory of Evidence. New York: John Wiley & Sons, pp. 193–236, 1994.

[11] Ding Han, Hou Ruichun, Ding Xiangqian. A Fault Diagnosis Method Based on Rough Set and Improved D-S Evidence Theory, Computer & Digital Engineering, vol. 47, no. 3, pp. 543–549, 2019.

[12] Zhang Wenxiu. The Theory and Method of Rough Sets. Beijing: Science Press, 2001.

[13] Sun Quan, Ye Xiuqing, Gu Weikang. A New Combination Rule of Evidence Theory, Acta Electronica Sinica, vol. 28, no. 8, pp. 117–119, 2000.

[14] Yager R. R. On the D-S Framework and New Combination Rules, Information Sciences, vol. 41, no. 2, pp. 93–138, 1987.

[15] Li Bicheng, Wang Bo, Wei Jun. An Efficient Combination Rule of Evidence Theory, Journal of Data Acquisition & Processing, vol. 17, no. 1, pp. 33–36, 2002.

[16] Zhao Rongyong, Zhang Hao, Li Cuiling. The Study and Application of Discretization Model for Continuous Attribute Values in Rough Set Theory, Computer Engineering and Applications, vol. 41, no. 8, pp. 40–42, 91, 2005.

[17] Xu Dong, Wang Xin, Meng Yulong, et al. A Discretization Algorithm Based on Forest Optimization Network and Variable Precision Rough Set, Journal of Northwestern Polytechnical University, vol. 38, no. 2, pp. 434–441, 2020.

[18] Yang Guang, Wu Xiaoping, Song Yexin, et al. Multi-sensor Information Fusion Fault Diagnosis Method Based on Rough Set Theory, Systems Engineering and Electronics, vol. 31, no. 8, pp. 2013–2019, 2009.

[19] Yang Guang, Yu Shuofeng. Synthesized Fault Diagnosis Method Reasoned from Rough Set-Neural Network and Evidence Theory, Concurrency and Computation: Practice and Experience, 2018; e4944. https://doi.org/10.1002/cpe.4944

[20] He You, Wang Guohong, Lu Dajin, et al. Multi-sensor Information Fusion with Applications. Beijing: Electronic Industry Press, 2000.

[21] Vesanto J., Alhoniemi E. Clustering of the Self-Organizing Map, IEEE Transactions on Neural Networks, vol. 11, pp. 586–598, 2000.
