Implementation and Evaluation of Machine Learning Algorithms in Ball Bearing Fault Detection
About this article
Published online: 12 Apr. 2025
Pages: 22–29
Received: 12 Apr. 2024
Accepted: 07 Mar. 2025
DOI: https://doi.org/10.2478/msr-2025-0004
© 2025 Pavle Stepanić et al., published by Sciendo
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Fig. 1. – Fig. 7. (figure images not reproduced; captions unavailable in this extract)
The optimized hyperparameters of the KNN algorithm

| Hyperparameter | Value |
|---|---|
| Number of neighbors | 1 |
| Distance metric | Correlation |
| Distance weight | Inverse |
| Standardize data | true |
| Accuracy | 100 % |
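The optimized configuration above can be sketched as follows. This is a hedged illustration in scikit-learn, which is an assumption — the paper does not state its tooling (the hyperparameter names suggest MATLAB's Classification Learner) — and the synthetic data is purely illustrative.

```python
# Sketch of the optimized KNN configuration from the table above.
# scikit-learn is assumed; the paper's own tooling is not stated.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def make_optimized_knn():
    """KNN with the table's settings: 1 neighbor, correlation
    distance, inverse-distance weighting, standardized inputs."""
    return make_pipeline(
        StandardScaler(),                # "Standardize data: true"
        KNeighborsClassifier(
            n_neighbors=1,               # "Number of neighbors: 1"
            metric="correlation",        # "Distance metric: Correlation"
            weights="distance",          # inverse-distance weighting
            algorithm="brute",           # scipy metrics require brute search
        ),
    )

# Toy usage on synthetic two-class "feature" data (illustrative only)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 6)), rng.normal(3, 1, (20, 6))])
y = np.array([0] * 20 + [1] * 20)
clf = make_optimized_knn().fit(X, y)
print(clf.score(X, y))  # 1-NN reproduces the training labels exactly
```

With one neighbor and inverse-distance weighting, each training sample is its own zero-distance match, so the training-set score is trivially perfect; the table's 100 % refers to the paper's evaluation, not to this toy check.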
Hybrid ML models

| Applied ML model | Research description |
|---|---|
| SVM with GA | A study applied SVM combined with GA to develop optimal classifiers for distinguishing healthy and faulty bearings in ASD systems, achieving 97.5 % accuracy [ ] |
| SVM and ANN with CWT | This study explored the use of SVM and ANN alongside CWT to analyze frame vibrations during motor start-up, achieving 96.67 % accuracy with SVM and 90 % with ANN [ ] |
| PCA and SVDD | PCA and SVDD were used to predict bearing failures, achieving 93.45 % accuracy [ ] |
| GA-based SVM | A GA-based kernel discriminative feature analysis was combined with one-against-all multicategory SVMs (OAA MCSVMs) for fault diagnosis in low-speed bearings, achieving the highest reported accuracy of 98.66 % [ ] |
| FEM and WPT with SVM | A hybrid approach integrating FEM, WPT, and SVM was proposed for fault classification, achieving 81 % accuracy for inner race faults and 79 % for rolling body faults [ ] |
| FFT-based feature extraction with SVM | The frequency domain features derived from FFT were used to train an SVM model for bearing fault classification, achieving 87.35 % accuracy [ ] |
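The FFT-based pipeline in the last row can be sketched as below. This is a hedged illustration, not the cited study's code: the band-energy features, the synthetic signals, and the RBF kernel choice are all assumptions made for the example.

```python
# Hedged sketch of FFT-based frequency-domain features feeding an SVM,
# in the spirit of the last table row; not the cited study's code.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def fft_band_energies(signal, n_bands=8):
    """Split the one-sided amplitude spectrum into equal-width bands
    and return the energy in each band as the feature vector."""
    spectrum = np.abs(np.fft.rfft(signal))
    bands = np.array_split(spectrum, n_bands)
    return np.array([np.sum(b ** 2) for b in bands])

# Synthetic example: "healthy" = 50 Hz tone, "faulty" = extra 120 Hz tone
rng = np.random.default_rng(1)
t = np.arange(1024) / 1024.0                     # 1 s at 1024 Hz (assumed)
def make_signal(faulty):
    s = np.sin(2 * np.pi * 50 * t) + 0.1 * rng.normal(size=t.size)
    if faulty:
        s += 0.8 * np.sin(2 * np.pi * 120 * t)   # fault signature tone
    return s

X = np.array([fft_band_energies(make_signal(i >= 30)) for i in range(60)])
y = np.array([0] * 30 + [1] * 30)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X, y)
print(clf.score(X, y))
```

The extra 120 Hz tone concentrates energy in one spectral band, so the two classes separate cleanly in this toy setting; real bearing signatures are of course noisier and load-dependent.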
The optimized hyperparameters of the SVM algorithm

| Hyperparameter | Value |
|---|---|
| Box constraint level | 977.88 |
| Kernel scale | 1 |
| Kernel function | Quadratic |
| Standardize data | true |
| Accuracy | 100 % |
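One way to express this configuration in scikit-learn is sketched below. The mapping is an assumption: "Box constraint level" corresponds to `C`, the quadratic kernel to a degree-2 polynomial with an inhomogeneous term (`coef0=1`, matching the usual MATLAB-style quadratic kernel), and "Kernel scale" is folded into `gamma`.

```python
# Sketch of the optimized SVM configuration from the table above.
# The parameter mapping to scikit-learn is an assumption.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def make_optimized_svm():
    return make_pipeline(
        StandardScaler(),              # "Standardize data: true"
        SVC(
            C=977.88,                  # "Box constraint level: 977.88"
            kernel="poly", degree=2,   # "Kernel function: Quadratic"
            coef0=1.0,                 # inhomogeneous quadratic (assumed)
            gamma=1.0,                 # "Kernel scale: 1" (assumed mapping)
        ),
    )

# Toy check on well-separated synthetic data (illustrative only)
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(-2, 0.5, (25, 4)), rng.normal(2, 0.5, (25, 4))])
y = np.array([0] * 25 + [1] * 25)
clf = make_optimized_svm().fit(X, y)
print(clf.score(X, y))
```

The large box constraint (C ≈ 978) penalizes margin violations heavily, which suits cleanly separable feature sets but can overfit noisier data.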
KNN hyperparameter search range

| Hyperparameter | Range |
|---|---|
| Number of neighbors | 1–98 |
| Distance metric | Euclidean, Cosine, Correlation, Chebyshev, Hamming, Minkowski, Spearman, Jaccard, City block, Mahalanobis |
| Distance weight | Equal, Inverse, Squared inverse |
| Standardize data | true, false |
SVM hyperparameter search range

| Hyperparameter | Range |
|---|---|
| Box constraint level | 0.001–1000 |
| Kernel scale | 0.001–1000 |
| Kernel function | Gaussian, Linear, Quadratic, Cubic |
| Standardize data | true, false |
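Search spaces like those in the two tables above can be explored with a randomized search, as sketched below for the SVM ranges. This scikit-learn sketch is an assumption — the paper's own optimizer (such as MATLAB's Bayesian hyperparameter optimization) is not reproduced — and the log-uniform sampling and gamma mapping are choices made for the example.

```python
# Hedged sketch of searching the SVM ranges above with scikit-learn.
# The paper's actual optimization procedure is not reproduced here.
import numpy as np
from scipy.stats import loguniform
from sklearn.model_selection import RandomizedSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

pipe = Pipeline([("scale", StandardScaler()), ("svm", SVC())])
param_space = {
    # "Box constraint level: 0.001-1000" -> C, sampled log-uniformly
    "svm__C": loguniform(1e-3, 1e3),
    # "Kernel scale: 0.001-1000" -> folded into gamma (assumed mapping)
    "svm__gamma": loguniform(1e-6, 1e6),
    # "Kernel function: Gaussian, Linear, Quadratic, Cubic"
    "svm__kernel": ["rbf", "linear", "poly"],
    "svm__degree": [2, 3],           # quadratic / cubic (poly kernel only)
}

# Toy data standing in for the bearing feature vectors
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 1, (30, 5)), rng.normal(2.5, 1, (30, 5))])
y = np.array([0] * 30 + [1] * 30)

search = RandomizedSearchCV(pipe, param_space, n_iter=20, cv=5,
                            random_state=0)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```

Log-uniform sampling matters here: both box constraint and kernel scale span six orders of magnitude, so uniform sampling would almost never visit the small-value end of the range.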
SVM classification

| Model No. | Kernel function | Classification success rate |
|---|---|---|
| 1 | Linear | 93.9 % |
| 2 | Polynomial | 99.5 % |
| 3 | RBF | 99 % |
The k-nearest neighbor (KNN) classification

| Model No. | Model name | Distance metric | Distance weight | Number of neighbors | Classification success rate |
|---|---|---|---|---|---|
| 1 | Cosine KNN | Cosine | Equal | 10 | 98.5 % |
| 2 | Coarse KNN | Euclidean | Equal | 100 | 74 % |
| 3 | Fine KNN | Euclidean | Equal | 1 | 97.4 % |
| 4 | Weighted KNN | Euclidean | Squared inverse | 10 | 97.4 % |
| 5 | Medium KNN | Euclidean | Equal | 1 | 98 % |
| 6 | Cubic KNN | Minkowski | Equal | 10 | 98.2 % |
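The six presets in the table can be expressed as scikit-learn configurations, as sketched below. The preset names come from MATLAB's Classification Learner, so this mapping is an assumption; the neighbor counts and metrics mirror the table as given.

```python
# Hedged sketch of the six KNN presets from the table above, expressed
# as scikit-learn configurations; the mapping is an assumption.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

PRESETS = {
    "Cosine KNN":   dict(metric="cosine", weights="uniform",
                         n_neighbors=10, algorithm="brute"),
    "Coarse KNN":   dict(metric="euclidean", weights="uniform",
                         n_neighbors=100),
    "Fine KNN":     dict(metric="euclidean", weights="uniform",
                         n_neighbors=1),
    "Weighted KNN": dict(metric="euclidean", n_neighbors=10,
                         # "Squared inverse" distance weighting
                         weights=lambda d: 1.0 / (d ** 2 + 1e-12)),
    "Medium KNN":   dict(metric="euclidean", weights="uniform",
                         n_neighbors=1),
    "Cubic KNN":    dict(metric="minkowski", p=3, weights="uniform",
                         n_neighbors=10),
}

def make_preset(name):
    return KNeighborsClassifier(**PRESETS[name])

# Toy comparison (illustrative only; needs >= 100 samples for Coarse KNN)
rng = np.random.default_rng(4)
X = np.vstack([rng.normal(0, 1, (60, 4)), rng.normal(3, 1, (60, 4))])
y = np.array([0] * 60 + [1] * 60)
for name in PRESETS:
    print(name, make_preset(name).fit(X, y).score(X, y))
```

Coarse KNN's weak result in the table is consistent with its design: averaging over 100 neighbors smooths away local class structure, which helps against noise but blurs fault-class boundaries when classes are compact.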