Modified Mass-Preserving Sample Entropy

  • Published : 2002.04.01

Abstract

In nonparametric entropy estimation, both the mass- and mean-preserving maximum entropy distribution (Theil, 1980) and the underlying distribution of the sample entropy (Vasicek, 1976), the most widely used entropy estimator, consist of n mass-preserving densities defined on disjoint intervals bounded by the simple averages of two adjacent order statistics. In this paper, we observe that these nonparametric density functions do not actually satisfy the mass-preserving constraint, and we propose a modified sample entropy that uses the generalized O-statistics (Kaigh and Driscoll, 1987) in averaging two adjacent order statistics. We apply the proposed estimator to a goodness-of-fit test for normality and compare its performance with that of the sample entropy.

References

  1. Journal of the American Statistical Association v.77 Testing symmetry Antille, A.; Kersting, G.; Zucchini, W. https://doi.org/10.2307/2287727
  2. American Statistician v.43 A test for normality based on Kullback-Leibler information Arizono, I.; Ohta, H. https://doi.org/10.2307/2685161
  3. Biometrika v.65 Power results for tests based on high-order gaps Cressie, N. https://doi.org/10.1093/biomet/65.1.214
  4. Journal of the Royal Statistical Society v.54 Testing exponentiality based on Kullback-Leibler information Ebrahimi, N.; Habibullah, M.; Soofi, E. S.
  5. Statistics and Probability Letters v.20 Two measures of sample entropy Ebrahimi, N.; Pflughoeft, K.; Soofi, E. S. https://doi.org/10.1016/0167-7152(94)90046-9
  6. American Statistician v.41 Numerical and graphical data summary using O-statistics Kaigh, W. D.; Driscoll, M. F. https://doi.org/10.2307/2684314
  7. Annals of Mathematical Statistics v.22 On information and sufficiency Kullback, S.; Leibler, R. A. https://doi.org/10.1214/aoms/1177729694
  8. On the goodness of fit test based on the Kullback-Leibler information Park, S.; Park, D.
  9. Bell System Technical Journal v.27 A mathematical theory of communication Shannon, C. E. https://doi.org/10.1002/j.1538-7305.1948.tb01338.x
  10. Journal of the American Statistical Association v.90 Information distinguishability with application to analysis of failure data Soofi, E. S.; Ebrahimi, N.; Habibullah, M. https://doi.org/10.2307/2291079
  11. Economics Letters v.5 The entropy of the maximum entropy distribution Theil, H. https://doi.org/10.1016/0165-1765(80)90089-0
  12. Journal of the Royal Statistical Society, Ser. B v.38 A test for normality based on sample entropy Vasicek, O.