
Figure 1

Block diagram of an ECG arrhythmia classification system.

Figure 2

Flow chart of the proposed work.

Figure 3

Normal ECG recording.

Figure 4

MLP Network structure.

Figure 5

RBF Network structure.

Figure 6

PNN Network structure.

Selected features and their correlation ranks.

Selected features Max-Q Std-a6 Std-a5 Std-a7 Std-a4 Std-a3 Std-a2 Std-a1 Std-a8 Moy-d8
Correlation rank 0.350 0.326 0.318 0.314 0.311 0.310 0.307 0.306 0.304 0.300
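As an illustration of how such a correlation-based ranking can be produced, the sketch below scores each extracted feature by the absolute Pearson correlation between its column and the class labels and sorts the features best-first. The variable names (features, labels) are illustrative and not taken from the paper.

```python
import numpy as np

def rank_features_by_correlation(features, labels):
    """Rank feature columns by |Pearson correlation| with the class labels.

    features: (n_samples, n_features) matrix of extracted ECG descriptors
    labels:   (n_samples,) vector of numeric class labels
    Returns a list of (feature_index, score), most correlated first.
    """
    scores = []
    for j in range(features.shape[1]):
        # Pearson correlation between one feature column and the label vector
        r = np.corrcoef(features[:, j], labels)[0, 1]
        scores.append((j, abs(r)))
    return sorted(scores, key=lambda item: item[1], reverse=True)
```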

RBF performance using different spread values.

Basis function Spread ACC(%) Test_time(s) Time_response(s) MSE
Inverse_multiquadric 10 99.9 0.121 0.643 0.250e-30
Inverse_multiquadric 1 99.9 0.128 0.668 0.250e-30
Inverse_multiquadric 0.1 99.9 0.140 0.676 0.250e-30

MLP, RBF, and PNN feed-forward neural networks.

Criteria type Criterion MLP RBF PNN
Structural Architecture An input layer, one or more hidden layers, and an output layer An input layer, one hidden layer, and an output layer An input layer, one hidden layer, a summation layer, and an output layer
Structural Activation function The activation function is non-linear (sigmoid, log-sigmoid, tan-sigmoid) The activation function is a radial basis function applied to the Euclidean distance between the input vector and its weights (centres) The activation function is based on the probability density function
Structural Number of hidden neurons No defined principle for determining the number of neurons No defined principle for determining the number of neurons The number is equal to the number of training instances
Structural Output layer The final layer applies an activation function to the linear combination of the hidden-layer outputs The final layer does not use an activation function; it linearly combines the outputs of the previous layer The final layer is a competitive output layer that picks the class with the maximum computed probability
Training Process Backpropagation training algorithms Backpropagation or clustering algorithms No weight training; the decision is made with the Bayesian decision rule
Parametric Parameters Momentum factor, learning rate, and further parameters depending on the training algorithm Number of centres, spread of the radial function Spread value of the probability density function
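To make the structural contrasts in this table concrete, the sketch below writes out the characteristic computation of each network: a sigmoid of a weighted sum for the MLP hidden layer, a radial function of the Euclidean distance to stored centres for the RBF hidden layer, and a per-class Parzen-window sum followed by a competitive pick for the PNN. It is a minimal illustration with assumed weights, centres, and spread values, not a reconstruction of the trained models.

```python
import numpy as np

def mlp_hidden(x, W, b):
    # MLP hidden layer: non-linear (sigmoid) activation of a weighted sum
    return 1.0 / (1.0 + np.exp(-(W @ x + b)))

def rbf_hidden(x, centers, spread):
    # RBF hidden layer: radial (here Gaussian) function of the Euclidean
    # distance between the input vector and each centre
    d = np.linalg.norm(centers - x, axis=1)
    return np.exp(-d**2 / (2.0 * spread**2))

def pnn_classify(x, train_X, train_y, spread):
    # PNN: one pattern unit per training instance, a summation layer that
    # accumulates the kernel responses per class, and a competitive output
    # layer that picks the class with the largest estimated probability.
    d2 = np.sum((train_X - x) ** 2, axis=1)
    kernel = np.exp(-d2 / (2.0 * spread**2))
    classes = np.unique(train_y)
    class_sums = np.array([kernel[train_y == c].sum() for c in classes])
    return classes[np.argmax(class_sums)]
```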

Performance of the PNN network according to the spread parameter.

Spread ACC(%) Test_time(s) Time_response(s) MSE
0.1 79.5 0.081 0.218 0.162
1 79.5 0.070 0.218 0.162
10 79.5 0.074 0.218 0.162

Performance of the optimized ANNs.

ANN NO/HL ACC(%) Tr_time(s) Test_time(s) Time_response(s) MSE
MLP_opt 10 86.4 0.294 0.096 0.390 0.064
RBF_opt 20 99.9 0.755 0.121 0.876 0.250e-30
PNN_opt 22 79.5 0.070 0.218 0.288 0.162
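For clarity, the accuracy and mean squared error reported throughout these tables follow their usual definitions (restated here; the exact expressions are not given in this section):

```latex
\mathrm{ACC} = \frac{\text{number of correctly classified test signals}}
                    {\text{total number of test signals}} \times 100\%,
\qquad
\mathrm{MSE} = \frac{1}{N}\sum_{i=1}^{N} \left( y_i - \hat{y}_i \right)^{2},
```

where $y_i$ is the target output, $\hat{y}_i$ the network output, and $N$ the number of test samples.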

Comparative study with related works.

Reference Pre-processing Datasets Division ACC(%)
(Abhinav-Vishwa et al., 2011) R peak MIT-BIH database of 48 signals of 30 min 50% for training and 50% for testing 96.8
(Rai et al., 2013) Morphological and DWT coefficients 45 ECG signals of 1 min from the MIT-BIH database 26 signals for training and 19 for testing 97.8
(Tomar et al., 2013) Morphological and DWT coefficients, power spectral density and energy of the periodogram 62 ECG signals of 10 s from the MIT-BIH and Normal Sinus Rhythm (NSR) databases Cross-validation division (70%, 30%) 98.4
(Savalia et al., 2017) R peaks, heart beats/min, duration of the QRS complex 66 ECG signals from the MIT-BIH arrhythmia and NSR databases Cross-validation division (70%, 30%) 82.5
(Dalvi et al., 2016) QRS complex, RR interval and beat waveform morphology; PCA for feature selection MIT-BIH database of 48 signals of 30 min 30 ECG records for training and 18 for testing 96.9
Proposed work Morphological and DWT coefficients 44 ECG signals of 1 min from the MIT-BIH database 22 signals for training and 22 for testing, following the AAMI recommendations 99.9

MLP performance with different learning algorithms.

Learning algorithm type Learning algorithm ACC(%) Tr_time(s) Test_time(s) Time_response(s) MSE
Jacobian derivatives trainlm 86.4 0.222 0.024 0.246 0.094
Jacobian derivatives trainbr 74.5 0.229 0.031 0.260 0.124
Gradient derivatives trainscg 86.4 0.553 0.355 0.908 0.077
Gradient derivatives traingda 81.8 0.416 0.218 0.634 0.078
Gradient derivatives trainrp 86.4 0.294 0.096 0.390 0.073
Gradient derivatives traingdx 81.8 0.543 0.345 0.888 0.076
Gradient derivatives trainbfg 86.4 0.330 0.132 0.462 0.087
Gradient derivatives traincgb 81.8 0.316 0.118 0.434 0.070
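The training functions named above are MATLAB Neural Network Toolbox algorithms: trainlm (Levenberg-Marquardt), trainbr (Bayesian regularization), trainscg (scaled conjugate gradient), traingda (gradient descent with adaptive learning rate), trainrp (resilient backpropagation), traingdx (gradient descent with momentum and adaptive learning rate), trainbfg (BFGS quasi-Newton), and traincgb (conjugate gradient with Powell-Beale restarts). As a hedged, non-equivalent illustration of sweeping training algorithms for the same MLP, the sketch below uses scikit-learn, whose solvers (lbfgs, sgd, adam) only loosely parallel the quasi-Newton and gradient-descent families and do not include Levenberg-Marquardt; the synthetic data merely stands in for the extracted ECG features.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for the DWT/morphological ECG feature vectors.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=0)

for solver in ("lbfgs", "sgd", "adam"):
    clf = MLPClassifier(hidden_layer_sizes=(10,), solver=solver,
                        max_iter=2000, random_state=0)
    clf.fit(X_train, y_train)
    print(solver, round(clf.score(X_test, y_test), 3))
```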

MLP performance by learning rate.

Lr ACC(%) Time_response(s) MSE
0.01 86.4 0.390 0.130
0.1 86.4 0.399 0.064
1 86.4 0.392 0.073

RBF performance using different numbers of neurons in the hidden layer (NO/HL).

NO/HL ACC(%) Tr_time(s) Test_time(s) Time_response(s) MSE
5 54.5 0.380 0.015 0.395 0.204
10 59.1 0.436 0.013 0.449 0.150
15 81.8 0.501 0.014 0.515 0.080
20 99.9 0.640 0.014 0.654 1.266e-30
25 99.9 0.699 0.017 0.716 2.411e-30

PNN performance.

NO/HL ACC(%) Tr_time(s) Test_time(s) Time_response(s) MSE
22 79.5 0.081 0.218 0.299 0.162

MLP performance using different numbers of hidden layers.

N_HL H1 H2 ACC(%) Tr_time(s) Test_time(s) Time_response(s) MSE
(N_HL: number of hidden layers; H1, H2: number of neurons (NO/HL) in the first and second hidden layer.)
1 5 0 72.7 0.221 0.113 0.334 0.026
1 10 0 86.4 0.222 0.094 0.316 0.024
1 15 0 81.8 0.380 0.099 0.479 0.021
1 20 0 81.8 0.390 0.073 0.463 0.016
1 25 0 81.8 0.406 0.071 0.477 0.013
2 10 5 81.8 0.315 0.238 0.553 0.019
2 10 10 81.8 0.319 0.253 0.572 0.021
2 10 15 72.7 0.383 0.281 0.664 0.018
2 10 20 72.7 0.423 0.317 0.740 0.017
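The one- and two-hidden-layer configurations in this table translate directly into a tuple of layer widths in most libraries; a brief, hedged scikit-learn illustration (again on synthetic stand-in data, not the authors' ECG features):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=200, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=0)

# (10,) is one hidden layer of 10 neurons; (10, 5) adds a second layer of 5,
# matching the N_HL / H1 / H2 columns of the table above.
for sizes in [(5,), (10,), (15,), (20,), (25,), (10, 5), (10, 10), (10, 15), (10, 20)]:
    clf = MLPClassifier(hidden_layer_sizes=sizes, max_iter=2000, random_state=0)
    clf.fit(X_train, y_train)
    print(sizes, round(clf.score(X_test, y_test), 3))
```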

RBF performance using different basis functions.

RBF function ACC(%) Tr_time(s) Test_time(s) Time_response(s) MSE
Gaussian 99.9 0.380 0.014 0.654 1.266e-30
Polyharmonic 99.9 0.436 0.051 0.639 1.221e-30
Inverse_multiquadric 99.9 0.501 0.121 0.676 0.250e-30
Multiquadric 99.9 0.640 0.132 0.698 0.891e-30
Biharmonic 99.9 0.699 0.002 0.640 0.891e-30
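For reference, the basis functions compared above have the standard forms below (the exact parameterization used by the authors is not stated in this section), with $r = \lVert \mathbf{x} - \mathbf{c} \rVert$ the Euclidean distance between the input and a centre and $\sigma$ the spread:

```latex
\varphi_{\mathrm{Gaussian}}(r) = \exp\!\left(-\frac{r^{2}}{2\sigma^{2}}\right), \qquad
\varphi_{\mathrm{multiquadric}}(r) = \sqrt{r^{2}+\sigma^{2}}, \qquad
\varphi_{\mathrm{inv.\ multiquadric}}(r) = \frac{1}{\sqrt{r^{2}+\sigma^{2}}}, \qquad
\varphi_{\mathrm{polyharmonic}}(r) =
\begin{cases}
r^{k}, & k \text{ odd},\\
r^{k}\ln r, & k \text{ even}.
\end{cases}
```

The biharmonic variant is commonly taken as the low-order polyharmonic case $\varphi(r) = r$.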