Kullback-Leibler Information-Based Tests of Fit for Inverse Gaussian Distribution

  • Choi, Byung-Jin (Department of Applied Information Statistics, Kyonggi University)
  • Received : 2011.11
  • Accepted : 2011.11
  • Published : 2011.12.31

Abstract

The entropy-based test of fit for the inverse Gaussian distribution presented by Mudholkar and Tian (2002) can only be applied to the composite hypothesis that a sample is drawn from an inverse Gaussian distribution with both the location and scale parameters unknown. In practice, however, a researcher may want a test of fit for an inverse Gaussian distribution with one parameter known, or with both parameters known. In this paper, we introduce tests of fit for the inverse Gaussian distribution based on the Kullback-Leibler information as an extension of the entropy-based test. A window size must be chosen to implement the proposed tests. By means of Monte Carlo simulations, window sizes are determined for a wide range of sample sizes and the corresponding critical values of the test statistics are estimated. The results of a power analysis against various alternatives show that the Kullback-Leibler information-based goodness-of-fit tests have good power.
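The construction underlying such tests, estimating the sample entropy by m-spacings in the style of Vasicek (1976) and comparing it with the mean log-density under the hypothesized inverse Gaussian model, as in Ebrahimi et al. (1992), can be sketched as follows for the simple-hypothesis case with both parameters known. This is a minimal illustration under stated assumptions, not the paper's exact statistics: the function names and the window size `m` are illustrative, and the unknown-parameter cases would plug in maximum likelihood estimates instead.

```python
import numpy as np

def vasicek_entropy(x, m):
    """Vasicek-type m-spacing estimator H_{m,n} of the sample entropy."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    # Pad order statistics at the boundaries: X_(j) = X_(1) for j < 1, X_(n) for j > n.
    upper = x[np.minimum(np.arange(n) + m, n - 1)]
    lower = x[np.maximum(np.arange(n) - m, 0)]
    return np.mean(np.log(n / (2.0 * m) * (upper - lower)))

def ig_logpdf(x, mu, lam):
    """Log-density of the inverse Gaussian IG(mu, lambda)."""
    x = np.asarray(x, dtype=float)
    return 0.5 * np.log(lam / (2.0 * np.pi * x**3)) - lam * (x - mu)**2 / (2.0 * mu**2 * x)

def kl_test_statistic(x, mu, lam, m):
    """Estimated Kullback-Leibler information against IG(mu, lambda):
    -H_{m,n} - (1/n) * sum of log f(X_i); large values argue against the null."""
    return -vasicek_entropy(x, m) - np.mean(ig_logpdf(x, mu, lam))
```

The statistic is an empirical version of the KL divergence between the true and hypothesized densities, which is zero exactly when the sample is drawn from the hypothesized inverse Gaussian model, so the test rejects for large values.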

In this paper, we introduce Kullback-Leibler information-based tests of fit that extend the previously developed entropy-based goodness-of-fit test for the inverse Gaussian distribution with both location and scale parameters unknown. Four forms of test statistics are presented for testing simple or composite null hypotheses about the inverse Gaussian distribution, and the window sizes and critical values to be used in computing the test statistics are determined by simulation for each sample size and provided in tables. In the power simulations, the Kullback-Leibler information-based test for the inverse Gaussian distribution with both the location and scale parameters known shows better power than the EDF tests for all alternative distributions and sample sizes. The tests for the inverse Gaussian distribution with only the location parameter or only the scale parameter known show power that increases with the sample size for all alternatives. The test for the inverse Gaussian distribution with both parameters unknown shows power generally similar to that of the entropy-based test, confirming that the two tests are equivalent.
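The Monte Carlo determination of critical values described above can be sketched by simulating the null distribution of a test statistic under the inverse Gaussian model. The sampler below follows the transformation method of Michael, Schucany and Haas (1976), cited in the references; `critical_value` and its arguments are illustrative names, and any of the paper's four statistics could be passed in as `stat`.

```python
import numpy as np

def rinvgauss(n, mu, lam, rng):
    """Sample n variates from IG(mu, lambda) via the transformation
    method of Michael, Schucany and Haas (1976)."""
    v = rng.standard_normal(n) ** 2                      # chi-square(1) variates
    x1 = (mu + mu**2 * v / (2.0 * lam)
          - (mu / (2.0 * lam)) * np.sqrt(4.0 * mu * lam * v + mu**2 * v**2))
    u = rng.uniform(size=n)
    # Accept the smaller root x1 with probability mu / (mu + x1), else take mu^2 / x1.
    return np.where(u <= mu / (mu + x1), x1, mu**2 / x1)

def critical_value(stat, n, m, alpha=0.05, reps=10000, seed=1):
    """Estimate the upper-alpha critical value of stat(sample, m)
    under the null IG(1, 1) by Monte Carlo simulation."""
    rng = np.random.default_rng(seed)
    sims = np.array([stat(rinvgauss(n, 1.0, 1.0, rng), m) for _ in range(reps)])
    return np.quantile(sims, 1.0 - alpha)
```

In practice the critical value would be tabulated, as in the paper, for each sample size together with the window size chosen for that sample size.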

References

  1. Chhikara, R. S. and Folks, J. L. (1989). The Inverse Gaussian Distribution: Theory, Methodology, and Applications, Marcel Dekker, New York.
  2. Cressie, N. (1976). On the logarithms of high-order spacings, Biometrika, 63, 343-355. https://doi.org/10.1093/biomet/63.2.343
  3. D'Agostino, R. B. and Stephens, M. A. (1986). Goodness-of-fit Techniques, Marcel Dekker, New York.
  4. Dudewicz, E. J. and van der Meulen, E. C. (1981). Entropy-based tests of uniformity, Journal of the American Statistical Association, 76, 967-974. https://doi.org/10.2307/2287597
  5. Edgeman, R. L. (1990). Assessing the inverse Gaussian distribution assumption, IEEE Transactions on Reliability, 39, 352-355. https://doi.org/10.1109/24.103017
  6. Edgeman, R. L., Scott, R. C. and Pavur, R. J. (1988). A modified Kolmogorov-Smirnov test for the inverse Gaussian density with unknown parameters, Communications in Statistics-Simulation and Computation, 17, 1203-1212. https://doi.org/10.1080/03610918808812721
  7. Ebrahimi, N., Habibullah, M. and Soofi, E. S. (1992). Testing for exponentiality based on Kullback-Leibler information, Journal of the Royal Statistical Society, Series B, 54, 739-748.
  8. Hadwiger, H. (1940). Natürliche Ausscheidefunktionen für Gesamtheiten und die Lösung der Erneuerungsgleichung, Mitteilungen der Vereinigung Schweizerischer Versicherungsmathematiker, 40, 31-49.
  9. Hall, P. (1984). Limit theorems for sums of general functions of m-spacings, Mathematical Proceedings of the Cambridge Philosophical Society, 96, 517-532.
  10. Hall, P. (1986). On powerful distributional tests based on sample spacings, Journal of Multivariate Analysis, 19, 201-255. https://doi.org/10.1016/0047-259X(86)90027-8
  11. Kullback, S. and Leibler, R. A. (1951). On information and sufficiency, Annals of Mathematical Statistics, 22, 79-86. https://doi.org/10.1214/aoms/1177729694
  12. Michael, J. R., Schucany, W. R. and Haas, R. W. (1976). Generating random variates using transformations with multiple roots, The American Statistician, 30, 88-90. https://doi.org/10.2307/2683801
  13. Mudholkar, G. S. and Tian, L. (2002). An entropy characterization of the inverse Gaussian distribution and related goodness-of-fit test, Journal of Statistical Planning and Inference, 102, 211-221. https://doi.org/10.1016/S0378-3758(01)00099-4
  14. Proschan, F. (1963). Theoretical explanation of observed decreasing failure rate, Technometrics, 5, 375-384. https://doi.org/10.2307/1266340
  15. Schrödinger, E. (1915). Zur Theorie der Fall- und Steigversuche an Teilchen mit Brownscher Bewegung, Physikalische Zeitschrift, 16, 289-295.
  16. Seshadri, V. (1999). The Inverse Gaussian Distribution: Statistical Theory and Applications, Springer, New York.
  17. Shannon, C. E. (1948). A mathematical theory of communication, Bell System Technical Journal, 27, 379-423, 623-656. https://doi.org/10.1002/j.1538-7305.1948.tb01338.x
  18. Smoluchowski, M. v. (1915). Notiz über die Berechnung der Brownschen Molekularbewegung bei der Ehrenhaft-Millikanschen Versuchsanordnung, Physikalische Zeitschrift, 16, 318-321.
  19. Tweedie, M. K. (1945). Inverse statistical variates, Nature, 155, 453.
  20. Tweedie, M. K. (1946). The regression of the sample variance on the sample mean, Journal of London Mathematical Society, 21, 22-28. https://doi.org/10.1112/jlms/s1-21.1.22
  21. Tweedie, M. K. (1947). Functions of a statistical variate with given means, with special reference to Laplacian distributions, Proceedings of the Cambridge Philosophical Society, 43, 41-49. https://doi.org/10.1017/S0305004100023185
  22. Tweedie, M. K. (1956). Some statistical properties of inverse Gaussian distributions, Virginia Journal of Science, 7, 160-165.
  23. Tweedie, M. K. (1957a). Statistical properties of inverse Gaussian distributions-I, Annals of Mathematical Statistics, 28, 362-377. https://doi.org/10.1214/aoms/1177706964
  24. Tweedie, M. K. (1957b). Statistical properties of inverse Gaussian distributions-II, Annals of Mathematical Statistics, 28, 696-705. https://doi.org/10.1214/aoms/1177706881
  25. van Es, B. (1992). Estimating functionals related to a density by a class of statistics based on spacings, Scandinavian Journal of Statistics, 19, 61-72.
  26. Vasicek, O. (1976). A test for normality based on sample entropy, Journal of the Royal Statistical Society, Series B, 38, 54-59.
  27. Wald, A. (1945). Sequential tests of statistical hypotheses, Annals of Mathematical Statistics, 16, 117-186. https://doi.org/10.1214/aoms/1177731118