 Title & Authors
Kullback-Leibler Information-Based Tests of Fit for Inverse Gaussian Distribution
Choi, Byung-Jin;
 Abstract
The entropy-based test of fit for the inverse Gaussian distribution proposed by Mudholkar and Tian (2002) applies only to the composite hypothesis that a sample is drawn from an inverse Gaussian distribution with both the location and scale parameters unknown. In practice, however, a researcher may need a test of fit for an inverse Gaussian distribution with one parameter known, or with both parameters known. In this paper, we introduce tests of fit for the inverse Gaussian distribution based on the Kullback-Leibler information, as an extension of the entropy-based test. Implementing the proposed tests requires choosing a window size. By means of Monte Carlo simulations, window sizes are determined for a wide range of sample sizes, and the corresponding critical values of the test statistics are estimated. Power analysis against various alternatives shows that the Kullback-Leibler information-based goodness-of-fit tests have good power.
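As a rough illustration only (not the authors' code), a Kullback-Leibler statistic of the kind described in the abstract can be sketched along the lines of Ebrahimi et al. (1992): estimate the sample entropy with a Vasicek-type m-spacing estimator and subtract the mean fitted inverse Gaussian log-density. The window size m, the boundary handling of the spacings, and the use of maximum likelihood estimates for the composite case are assumptions of this sketch, not details taken from the paper.

```python
import numpy as np

def vasicek_entropy(x, m):
    """Vasicek (1976) m-spacing estimate of the sample entropy."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    # Clip indices at the boundaries so edge spacings are defined.
    upper = x[np.minimum(np.arange(n) + m, n - 1)]
    lower = x[np.maximum(np.arange(n) - m, 0)]
    return np.mean(np.log(n / (2.0 * m) * (upper - lower)))

def ig_mean_loglik(x, mu, lam):
    """Mean log-density of the inverse Gaussian IG(mu, lambda)."""
    x = np.asarray(x, dtype=float)
    return np.mean(0.5 * np.log(lam / (2.0 * np.pi * x**3))
                   - lam * (x - mu)**2 / (2.0 * mu**2 * x))

def kl_test_statistic(x, m, mu=None, lam=None):
    """Estimated KL information between the sample and a fitted IG.

    mu and/or lam may be supplied (known-parameter hypotheses) or
    left None, in which case MLEs are plugged in (composite case).
    Large values indicate lack of fit.
    """
    x = np.asarray(x, dtype=float)
    if mu is None:
        mu = x.mean()                          # MLE of the mean parameter
    if lam is None:
        lam = len(x) / np.sum(1.0 / x - 1.0 / mu)  # MLE of the scale parameter
    # KL(g, f) is approximated by -H_hat - mean log f(X; mu, lam).
    return -vasicek_entropy(x, m) - ig_mean_loglik(x, mu, lam)
```

For data actually drawn from an inverse Gaussian distribution the statistic should be close to zero (up to the bias of the spacing-based entropy estimate), which is why the test rejects for large values; the paper's Monte Carlo study supplies the window sizes and critical values that this sketch leaves unspecified.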
 Keywords
Inverse Gaussian distribution;Kullback-Leibler information;goodness-of-fit test;power;
 Language
Korean
 References
1.
Chhikara, R. S. and Folks, J. L. (1989). The Inverse Gaussian Distribution: Theory, Methodology, and Applications, Marcel Dekker, New York.

2.
Cressie, N. (1976). On the logarithms of high-order spacings, Biometrika, 63, 343-355.

3.
D'Agostino, R. B. and Stephens, M. A. (1986). Goodness-of-fit Techniques, Marcel Dekker, New York.

4.
Dudewicz, E. J. and van der Meulen, E. C. (1981). Entropy-based tests of uniformity, Journal of the American Statistical Association, 76, 967-974.

5.
Edgeman, R. L. (1990). Assessing the inverse Gaussian distribution assumption, IEEE Transactions on Reliability, 39, 352-355.

6.
Edgeman, R. L., Scott, R. C. and Pavur, R. J. (1988). A modified Kolmogorov-Smirnov test for the inverse Gaussian density with unknown parameters, Communications in Statistics - Simulation and Computation, 17, 1203-1212.

7.
Ebrahimi, N., Habibullah, M. and Soofi, E. S. (1992). Testing for exponentiality based on Kullback-Leibler information, Journal of the Royal Statistical Society, Series B, 54, 739-748.

8.
Hadwiger, H. (1940). Natürliche Ausscheidefunktionen für Gesamtheiten und die Lösung der Erneuerungsgleichung, Mitteilungen der Vereinigung Schweizerischer Versicherungsmathematiker, 40, 31-49.

9.
Hall, P. (1984). Limit theorems for sums of general functions of m-spacings, Mathematical Proceedings of the Cambridge Philosophical Society, 96, 517-532.

10.
Hall, P. (1986). On powerful distributional tests based on sample spacings, Journal of Multivariate Analysis, 19, 201-255.

11.
Kullback, S. and Leibler, R. A. (1951). On information and sufficiency, Annals of Mathematical Statistics, 22, 79-86.

12.
Michael, J. R., Schucany, W. R. and Haas, R. W. (1976). Generating random variates using transformations with multiple roots, The American Statistician, 30, 88-90.

13.
Mudholkar, G. S. and Tian, L. (2002). An entropy characterization of the inverse Gaussian distribution and related goodness-of-fit test, Journal of Statistical Planning and Inference, 102, 211-221.

14.
Proschan, F. (1963). Theoretical explanation of observed decreasing failure rate, Technometrics, 5, 375-384.

15.
Schrödinger, E. (1915). Zur Theorie der Fall- und Steigversuche an Teilchen mit Brownscher Bewegung, Physikalische Zeitschrift, 16, 289-295.

16.
Seshadri, V. (1999). The Inverse Gaussian Distribution: Statistical Theory and Applications, Springer, New York.

17.
Shannon, C. E. (1948). A mathematical theory of communication, Bell System Technical Journal, 27, 379-423, 623-656.

18.
Smoluchowski, M. V. (1915). Notiz über die Berechnung der Brownschen Molekularbewegung bei der Ehrenhaft-Millikanschen Versuchsanordnung, Physikalische Zeitschrift, 16, 318-321.

19.
Tweedie, M. K. (1945). Inverse statistical variates, Nature, 155, 453.

20.
Tweedie, M. K. (1946). The regression of the sample variance on the sample mean, Journal of the London Mathematical Society, 21, 22-28.

21.
Tweedie, M. K. (1947). Functions of a statistical variate with given means, with special reference to Laplacian distributions, Proceedings of the Cambridge Philosophical Society, 43, 41-49.

22.
Tweedie, M. K. (1956). Some statistical properties of inverse Gaussian distributions, Virginia Journal of Science, 7, 160-165.

23.
Tweedie, M. K. (1957a). Statistical properties of inverse Gaussian distributions-I, Annals of Mathematical Statistics, 28, 362-377.

24.
Tweedie, M. K. (1957b). Statistical properties of inverse Gaussian distributions-II, Annals of Mathematical Statistics, 28, 696-705.

25.
van Es, B. (1992). Estimating functionals related to a density by a class of statistics based on spacings, Scandinavian Journal of Statistics, 19, 61-72.

26.
Vasicek, O. (1976). A test for normality based on sample entropy, Journal of the Royal Statistical Society, Series B, 38, 54-59.

27.
Wald, A. (1945). Sequential tests of statistical hypotheses, Annals of Mathematical Statistics, 16, 117-186.