Performances of Shannon’s Entropy Statistic in Assessment of Distribution of Data
Published online: 20 Sep 2017
Pages: 30 - 42
Received: 13 Jun 2017
Accepted: 26 Jul 2017
DOI: https://doi.org/10.1515/auoc-2017-0006
© Ovidius University Press
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.
Statistical analysis starts with the assessment of the distribution of the experimental data. Different statistics are used to test the null hypothesis (H0), stated as "the data follow a certain/specified distribution." In this paper, a new test based on Shannon's entropy (called Shannon's entropy statistic, H1) is introduced as a goodness-of-fit test. The performance of the Shannon's entropy statistic was tested on simulated data with a uniform distribution and on experimental data assumed to follow four continuous distributions (error function, generalized extreme value, lognormal, and normal). The experimental data used in the assessment were properties or activities of active chemical compounds. Five known goodness-of-fit tests, namely Anderson-Darling, Kolmogorov-Smirnov, Cramér-von Mises, Kuiper V, and Watson U2, were used alongside H1 to assess its performance.
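As a minimal sketch (not the paper's exact H1 formula, which is defined in the full text), the snippet below illustrates the two ingredients named in the abstract: Shannon's entropy computed on binned sample frequencies, shown next to two of the classical goodness-of-fit tests (Kolmogorov-Smirnov and Anderson-Darling) as implemented in SciPy. The bin rule, sample size, and fitted-normal null hypothesis are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=2017)
sample = rng.normal(loc=0.0, scale=1.0, size=200)  # simulated data, H0: normal

# Shannon's entropy of the binned sample, H = -sum(p_i * ln(p_i)),
# using Sturges' rule for the number of bins (an illustrative choice,
# not necessarily the binning used for the paper's H1 statistic).
counts, _ = np.histogram(sample, bins="sturges")
p = counts[counts > 0] / counts.sum()
shannon_entropy = -np.sum(p * np.log(p))

# Two of the classical goodness-of-fit tests named in the abstract,
# evaluated against the normal distribution fitted to the sample.
mu, sigma = sample.mean(), sample.std(ddof=1)
ks_stat, ks_p = stats.kstest(sample, "norm", args=(mu, sigma))
ad_result = stats.anderson(sample, dist="norm")

print(f"Shannon entropy of binned data: {shannon_entropy:.4f}")
print(f"Kolmogorov-Smirnov: D = {ks_stat:.4f}, p = {ks_p:.4f}")
print(f"Anderson-Darling:   A2 = {ad_result.statistic:.4f}")
```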