References

R. Roelofs, S. Fridovich-Keil, J. Miller, V. Shankar, M. Hardt, B. Recht, and L. Schmidt, “A meta-analysis of overfitting in machine learning,” in Proceedings of the 33rd International Conference on Neural Information Processing Systems, 2019, pp. 9179–9189.

X. Ying, “An overview of overfitting and its solutions,” in Journal of Physics: Conference Series, vol. 1168, no. 2. IOP Publishing, 2019, p. 022022.

M. Li, H. Wang, L. Yang, Y. Liang, Z. Shang, and H. Wan, “Fast hybrid dimensionality reduction method for classification based on feature selection and grouped feature extraction,” Expert Systems with Applications, vol. 150, p. 113277, 2020.

H. Liu, H. Motoda, and L. Yu, “A selective sampling approach to active feature selection,” Artificial Intelligence, vol. 159, no. 1-2, pp. 49–74, 2004.

Y. Akhiat, Y. Asnaoui, M. Chahhou, and A. Zinedine, “A new graph feature selection approach,” in 2020 6th IEEE Congress on Information Science and Technology (CiSt). IEEE, 2021, pp. 156–161.

D. M. Atallah, M. Badawy, and A. El-Sayed, “Intelligent feature selection with modified k-nearest neighbor for kidney transplantation prediction,” SN Applied Sciences, vol. 1, no. 10, pp. 1–17, 2019.

I. Guyon, S. Gunn, M. Nikravesh, and L. A. Zadeh, Feature Extraction: Foundations and Applications. Springer, 2008, vol. 207.

I. Guyon and A. Elisseeff, “An introduction to feature extraction,” in Feature Extraction. Springer, 2006, pp. 1–25.

A. Yassine, “Feature selection methods for high dimensional data,” 2021.

Y. Manzali, Y. Akhiat, M. Chahhou, M. Elmohajir, and A. Zinedine, “Reducing the number of trees in a forest using noisy features,” Evolving Systems, pp. 1–18, 2022.

Y. Akhiat, Y. Manzali, M. Chahhou, and A. Zinedine, “A new noisy random forest based method for feature selection,” Cybernetics and Information Technologies, vol. 21, no. 2, 2021.

S. Abe, “Feature selection and extraction,” in Support Vector Machines for Pattern Classification. Springer, 2010, pp. 331–341.

J. Cai, J. Luo, S. Wang, and S. Yang, “Feature selection in machine learning: A new perspective,” Neurocomputing, vol. 300, pp. 70–79, 2018.

Y. Akhiat, M. Chahhou, and A. Zinedine, “Feature selection based on graph representation,” in 2018 IEEE 5th International Congress on Information Science and Technology (CiSt). IEEE, 2018, pp. 232–237.

J. C. Ang, A. Mirzal, H. Haron, and H. N. A. Hamed, “Supervised, unsupervised, and semi-supervised feature selection: a review on gene selection,” IEEE/ACM Transactions on Computational Biology and Bioinformatics, vol. 13, no. 5, pp. 971–989, 2015.

L. A. Belanche and F. F. González, “Review and evaluation of feature selection algorithms in synthetic problems,” arXiv preprint arXiv:1101.2320, 2011.

G. Chandrashekar and F. Sahin, “A survey on feature selection methods,” Computers & Electrical Engineering, vol. 40, no. 1, pp. 16–28, 2014.

B. Nithya and V. Ilango, “Evaluation of machine learning based optimized feature selection approaches and classification methods for cervical cancer prediction,” SN Applied Sciences, vol. 1, no. 6, pp. 1–16, 2019.

A. Bommert, X. Sun, B. Bischl, J. Rahnenführer, and M. Lang, “Benchmark for filter methods for feature selection in high-dimensional classification data,” Computational Statistics & Data Analysis, vol. 143, p. 106839, 2020.

Y. Akhiat, M. Chahhou, and A. Zinedine, “Ensemble feature selection algorithm,” International Journal of Intelligent Systems and Applications, vol. 11, no. 1, p. 24, 2019.

L. Čehovin and Z. Bosnić, “Empirical evaluation of feature selection methods in classification,” Intelligent Data Analysis, vol. 14, no. 3, pp. 265–281, 2010.

Y. Asnaoui, Y. Akhiat, and A. Zinedine, “Feature selection based on attributes clustering,” in 2021 Fifth International Conference On Intelligent Computing in Data Sciences (ICDS). IEEE, 2021, pp. 1–5.

Y. Bouchlaghem, Y. Akhiat, and S. Amjad, “Feature selection: A review and comparative study,” in E3S Web of Conferences, vol. 351. EDP Sciences, 2022, p. 01046.

A. Destrero, S. Mosci, C. De Mol, A. Verri, and F. Odone, “Feature selection for high-dimensional data,” Computational Management Science, vol. 6, pp. 25–40, 2009.

V. Fonti and E. Belitser, “Feature selection using lasso,” VU Amsterdam Research Paper in Business Analytics, vol. 30, pp. 1–25, 2017.

I. Guyon and A. Elisseeff, “An introduction to variable and feature selection,” Journal of Machine Learning Research, vol. 3, no. Mar, pp. 1157–1182, 2003.

R. Zebari, A. Abdulazeez, D. Zeebaree, D. Zebari, and J. Saeed, “A comprehensive review of dimensionality reduction techniques for feature selection and feature extraction,” Journal of Applied Science and Technology Trends, vol. 1, no. 2, pp. 56–70, 2020.

J. Miao and L. Niu, “A survey on feature selection,” Procedia Computer Science, vol. 91, pp. 919–926, 2016.

L. C. Molina, L. Belanche, and À. Nebot, “Feature selection algorithms: A survey and experimental evaluation,” in 2002 IEEE International Conference on Data Mining, 2002. Proceedings. IEEE, 2002, pp. 306–313.

R. Caruana, A. Niculescu-Mizil, G. Crew, and A. Ksikes, “Ensemble selection from libraries of models,” in Proceedings of the Twenty-First International Conference on Machine Learning, 2004, p. 18.

A. Yassine, C. Mohamed, and A. Zinedine, “Feature selection based on pairwise evaluation,” in 2017 Intelligent Systems and Computer Vision (ISCV). IEEE, 2017, pp. 1–6.

B. Gregorutti, B. Michel, and P. Saint-Pierre, “Correlation and variable importance in random forests,” Statistics and Computing, vol. 27, no. 3, pp. 659–678, 2017.

J. Kacprzyk, J. W. Owsinski, and D. A. Viattchenin, “A new heuristic possibilistic clustering algorithm for feature selection,” Journal of Automation Mobile Robotics and Intelligent Systems, vol. 8, 2014.

L. Breiman, “Random forests,” Machine Learning, vol. 45, no. 1, pp. 5–32, 2001.

H. Han, X. Guo, and H. Yu, “Variable selection using mean decrease accuracy and mean decrease gini based on random forest,” in 2016 7th IEEE International Conference on Software Engineering and Service Science (ICSESS). IEEE, 2016, pp. 219–224.

R. S. Sutton and A. G. Barto, Reinforcement Learning: An Introduction, 2nd ed. MIT Press, 2018.

Y. Fenjiro and H. Benbrahim, “Deep reinforcement learning: Overview of the state of the art,” Journal of Automation, Mobile Robotics and Intelligent Systems, pp. 20–39, 2018.

S. M. H. Fard, A. Hamzeh, and S. Hashemi, “Using reinforcement learning to find an optimal set of features,” Computers & Mathematics with Applications, vol. 66, no. 10, pp. 1892–1904, 2013.

M. Lichman, UCI Machine Learning Repository. Irvine, CA: University of California, School of Information and Computer Science, 2013. [Online]. Available: http://archive.ics.uci.edu/ml

F. Provost, T. Fawcett, and R. Kohavi, “The case against accuracy estimation for comparing induction algorithms,” in Proceedings of the Fifteenth International Conference on Machine Learning, 1998.