A Modified Entropy-Based Goodness-of-Fit Test for Inverse Gaussian Distribution


Choi, Byung-Jin

  • Received : 2011.02
  • Accepted : 2011.03
  • Published : 2011.04.30

Abstract

This paper presents a modified entropy-based test of fit for the inverse Gaussian distribution. The test is based on the entropy difference between the unknown data-generating distribution and the inverse Gaussian distribution. The entropy difference estimator used as the test statistic is obtained by employing Vasicek's sample entropy as an entropy estimator for the data-generating distribution and the uniformly minimum variance unbiased estimator as an entropy estimator for the inverse Gaussian distribution. Empirically determined critical values of the test statistic are provided in tabular form. Monte Carlo simulations are performed to compare the proposed test with the previous entropy-based test in terms of power.
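The construction described above can be sketched in code. The following Python sketch computes Vasicek's spacing-based sample entropy and a plug-in version of the entropy-difference statistic; it is illustrative only. In particular, the paper uses the uniformly minimum variance unbiased estimator of the inverse Gaussian entropy (Choi, 2006), which is replaced here by a Monte Carlo approximation from the fitted distribution, and the function names, the window size `m`, and the use of `scipy.stats.invgauss` are choices made for this sketch rather than details from the paper.

```python
# A minimal sketch (not the paper's exact procedure) of an entropy-difference
# statistic for testing fit to the inverse Gaussian distribution.
# Assumptions: the UMVUE entropy estimator of the fitted IG distribution used
# in the paper is approximated here by Monte Carlo, purely for illustration.

import numpy as np
from scipy import stats


def vasicek_entropy(x, m):
    """Vasicek's (1976) spacing-based estimator of differential entropy."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    # Boundary convention: order statistics are extended at both ends.
    upper = x[np.minimum(np.arange(n) + m, n - 1)]
    lower = x[np.maximum(np.arange(n) - m, 0)]
    return np.mean(np.log(n * (upper - lower) / (2.0 * m)))


def ig_mle(x):
    """Maximum likelihood estimates of (mu, lambda) for IG(mu, lambda)."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    lam = x.size / np.sum(1.0 / x - 1.0 / mu)
    return mu, lam


def entropy_difference_statistic(x, m, n_mc=100_000, seed=None):
    """Vasicek entropy of the data minus a Monte Carlo estimate of the
    entropy of the fitted inverse Gaussian distribution (plug-in version)."""
    rng = np.random.default_rng(seed)
    mu, lam = ig_mle(x)
    fitted = stats.invgauss(mu / lam, scale=lam)   # IG(mu, lambda) in scipy's form
    sims = fitted.rvs(size=n_mc, random_state=rng)
    ig_entropy = -np.mean(fitted.logpdf(sims))     # Monte Carlo entropy of fitted IG
    return vasicek_entropy(x, m) - ig_entropy


if __name__ == "__main__":
    # Critical values would be obtained, as in the paper, by simulating the
    # statistic for samples drawn from an inverse Gaussian distribution.
    rng = np.random.default_rng(0)
    data = stats.invgauss(0.5, scale=2.0).rvs(size=50, random_state=rng)
    print(entropy_difference_statistic(data, m=4, seed=1))
```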

Keywords

Inverse Gaussian distribution; entropy; entropy characterization; entropy estimator; entropy-based test; power

References

  1. Ahmed, N. A. and Gokhale, D. V. (1989). Entropy expressions and their estimators for multivariate distributions, IEEE Transactions on Information Theory, 35, 688–692. https://doi.org/10.1109/18.30996
  2. Chhikara, R. S. and Folks, J. L. (1989). The Inverse Gaussian Distribution: Theory, Methodology, and Applications, Marcel Dekker, New York.
  3. Choi, B. (2006). Minimum variance unbiased estimation for the maximum entropy of the transformed inverse Gaussian random variable by $Y=X^{-1/2}$, The Korean Communications in Statistics, 13, 657–667. https://doi.org/10.5351/CKSS.2006.13.3.657
  4. Choi, B. and Kim, K. (2006). Testing goodness-of-fit for Laplace distribution based on maximum entropy, Statistics, 40, 517–531. https://doi.org/10.1080/02331880600822473
  5. Cressie, N. (1976). On the logarithms of high-order spacings, Biometrika, 63, 343–355. https://doi.org/10.1093/biomet/63.2.343
  6. Dudewicz, E. J. and van der Meulen, E. C. (1981). Entropy-based tests of uniformity, Journal of the American Statistical Association, 76, 967–974. https://doi.org/10.2307/2287597
  7. Edgeman, R. L. (1990). Assessing the inverse Gaussian distribution assumption, IEEE Transactions on Reliability, 39, 352–355. https://doi.org/10.1109/24.103017
  8. Edgeman, R. L., Scott, R. C. and Pavur, R. J. (1988). A modified Kolmogorov-Smirnov test for the inverse Gaussian density with unknown parameters, Communications in Statistics-Simulation and Computation, 17, 1203–1212. https://doi.org/10.1080/03610918808812721
  9. Edgeman, R. L., Scott, R. C. and Pavur, R. J. (1992). Quadratic statistics for the goodness-of-fit test for the inverse Gaussian distribution, IEEE Transactions on Reliability, 41, 118–123. https://doi.org/10.1109/24.126682
  10. Gradshteyn, I. S. and Ryzhik, I. M. (2000). Table of Integrals, Series, and Products, Academic Press, San Diego.
  11. Grzegorzewski, P. and Wieczorkowski, P. (1999). Entropy-based goodness-of-fit test for exponentiality, Communications in Statistics-Theory and Methods, 28, 1183–1202. https://doi.org/10.1080/03610929908832351
  12. Kapur, J. N. and Kesavan, H. K. (1992). Entropy Optimization Principles with Applications, Academic Press, San Diego.
  13. Lieblein, J. and Zelen, M. (1956). Statistical investigation of the fatigue life of deep groove ball bearings, Journal of Research of the National Bureau of Standards, 57, 273–316. https://doi.org/10.6028/jres.057.033
  14. Michael, J. R., Schucany, W. R. and Haas, R. W. (1976). Generating random variates using transformations with multiple roots, The American Statistician, 30, 88–90. https://doi.org/10.2307/2683801
  15. Mudholkar, G. S. and Tian, L. (2002). An entropy characterization of the inverse Gaussian distribution and related goodness-of-fit test, Journal of Statistical Planning and Inference, 102, 211–221. https://doi.org/10.1016/S0378-3758(01)00099-4
  16. O'Reilly, F. J. and Rueda, R. (1992). Goodness of fit for the inverse Gaussian distribution, The Canadian Journal of Statistics, 20, 387–397. https://doi.org/10.2307/3315609
  17. Seshadri, V. (1999). The Inverse Gaussian Distribution: Statistical Theory and Applications, Springer, New York.
  18. Shannon, C. E. (1948). A mathematical theory of communication, Bell System Technical Journal, 27, 379–423, 623–656. https://doi.org/10.1002/j.1538-7305.1948.tb01338.x
  19. van Es, B. (1992). Estimating functionals related to a density by a class of statistics based on spacings, Scandinavian Journal of Statistics, 19, 61–72.
  20. Vasicek, O. (1976). A test for normality based on sample entropy, Journal of the Royal Statistical Society, Series B, 38, 54–59.