Volume 30 (2022): Issue 1 (February 2022)
Journal Details
License
Format
Journal
eISSN
1844-0835
First Published
17 May 2013
Publication timeframe
1 time per year
Languages
English
Access type: Open Access

Nearest neighbor estimates of Kaniadakis entropy

Published Online: 12 Mar 2022
Volume & Issue: Volume 30 (2022) - Issue 1 (February 2022)
Page range: 171 - 189
Received: 14 Jul 2021
Accepted: 13 Sep 2021
Abstract

The aim of this paper is to develop new nonparametric estimators of entropy based on the kth nearest neighbor distances between n sample points, where k ≤ n − 1 is a fixed positive integer. The method consists in using these new estimators to evaluate the entropies of random vectors. As a result, for the Kaniadakis entropy measure, the asymptotic unbiasedness and consistency of the estimators are proven.
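The abstract describes entropy estimators built from kth-nearest-neighbor distances. As an illustration only, and not the paper's actual estimator, the sketch below combines a classical Kozachenko-Leonenko-style kNN density estimate with the Kaniadakis κ-logarithm, ln_κ(x) = (x^κ − x^(−κ))/(2κ), in a simple plug-in fashion. The function names, the plug-in form, and the default κ are assumptions made for the example.

```python
import math
import numpy as np

def kappa_log(x, kappa):
    """Kaniadakis kappa-logarithm: ln_k(x) = (x**k - x**(-k)) / (2k).
    It recovers the ordinary natural logarithm as kappa -> 0."""
    if kappa == 0:
        return np.log(x)
    return (x ** kappa - x ** (-kappa)) / (2 * kappa)

def knn_kaniadakis_entropy(sample, k=3, kappa=0.05):
    """Hypothetical plug-in estimate of the Kaniadakis entropy
    S_kappa = -E[ln_kappa f(X)], using kNN density estimates
    f_hat(x_i) = k / ((n - 1) * V_d * rho_k(i)**d), where rho_k(i)
    is the distance from x_i to its k-th nearest neighbor."""
    x = np.asarray(sample, dtype=float)
    if x.ndim == 1:
        x = x[:, None]
    n, d = x.shape
    # Pairwise Euclidean distances; exclude each point from its own neighbors.
    dist = np.sqrt(((x[:, None, :] - x[None, :, :]) ** 2).sum(-1))
    np.fill_diagonal(dist, np.inf)
    rho = np.sort(dist, axis=1)[:, k - 1]          # k-th nearest neighbor distance
    vd = math.pi ** (d / 2) / math.gamma(d / 2 + 1)  # volume of the unit d-ball
    f_hat = k / ((n - 1) * vd * rho ** d)          # kNN density estimate
    return -np.mean(kappa_log(f_hat, kappa))
```

Note that this naive plug-in version carries a known bias of roughly ln k − ψ(k) relative to bias-corrected estimators; for a standard normal sample it should land near, but somewhat below, the Shannon value ½ ln(2πe) ≈ 1.419 when κ is small.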

Keywords

