On scaled cumulative residual Kullback-Leibler information

  • Hwang, Insung (Department of Applied Statistics, Yonsei University)
  • Park, Sangun (Department of Applied Statistics, Yonsei University)
  • Received : 2013.08.07
  • Accepted : 2013.10.07
  • Published : 2013.11.30

Abstract

Cumulative residual Kullback-Leibler (CRKL) information is well defined on the empirical distribution function (EDF) and allows us to construct an EDF-based goodness-of-fit test statistic. However, because CRKL is not scale invariant, we need to consider a scaled CRKL. In this paper, we consider several criteria for estimating the scale parameter in the scaled CRKL and compare the performances of the resulting estimated CRKL statistics in terms of both power and unbiasedness.
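
As a concrete illustration of the statistic discussed above, the sketch below computes the EDF-based CRKL against an exponential model, using the definition CRKL(F, G) = integral of F-bar(x) log{F-bar(x)/G-bar(x)} dx minus the difference of means E_F(X) - E_G(X), following Park, Rao and Shin (2012), with the exponential scale estimated by the sample mean. The code is not from the paper: the sample-mean choice is only one of the estimation criteria the paper compares, and the function name crkl_exponential is hypothetical.

    import numpy as np

    def crkl_exponential(x, theta=None):
        """EDF-based CRKL statistic against an exponential(theta) model.

        CRKL(F_n, G_theta) = int F_n-bar log(F_n-bar / G_theta-bar) dt
                             - (sample mean - theta),
        where F_n-bar is the empirical survival function and
        G_theta-bar(t) = exp(-t / theta).  If theta is None, it is
        estimated by the sample mean (one possible criterion only).
        """
        x = np.sort(np.asarray(x, dtype=float))
        n = x.size
        if theta is None:
            theta = x.mean()

        # Integral of F_n-bar * log(F_n-bar): F_n-bar = 1 - i/n on [x_(i), x_(i+1)),
        # and the pieces below x_(1) and above x_(n) contribute zero.
        surv = 1.0 - np.arange(1, n) / n
        term_entropy = np.sum(surv * np.log(surv) * np.diff(x))

        # -Integral of F_n-bar * log(G_theta-bar) = integral of F_n-bar(t) * t / theta dt,
        # which equals mean(x**2) / (2 * theta) under the empirical distribution.
        term_model = np.mean(x ** 2) / (2.0 * theta)

        return term_entropy + term_model - (x.mean() - theta)

    # Example: under the exponential null the statistic should be close to zero.
    rng = np.random.default_rng(0)
    print(crkl_exponential(rng.exponential(scale=2.0, size=100)))

In a test of exponentiality, the observed value would typically be compared with critical values simulated under the exponential null, since large values of the statistic indicate departure from exponentiality.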

Keywords

References

  1. Balakrishnan, N., Rad, A. H. and Arghami, N. R. (2007). Testing exponentiality based on Kullback-Leibler information with progressively Type-II censored data. IEEE Transactions on Reliability, 56, 349-356. https://doi.org/10.1109/TR.2007.896682
  2. Baratpour, S. and Rad, A. H. (2012). Testing goodness-of-fit for exponential distribution based on cumulative residual entropy. Communications in Statistics-Theory and Methods, 41, 1387-1396. https://doi.org/10.1080/03610926.2010.542857
  3. Park, S. (2012). Generalized Kullback-Leibler information and its extensions to censored and discrete cases. Journal of the Korean Data & Information Science Society, 23, 1223-1229. https://doi.org/10.7465/jkdi.2012.23.6.1223
  4. Park, S. (2013). On censored cumulative residual Kullback-Leibler information and goodness-of-fit test with Type II censored data. Submitted to Statistical Papers (under 2nd revision).
  5. Park, S., Rao, M. and Shin, D. W. (2012). On cumulative residual Kullback-Leibler information. Statistics and Probability Letters, 82, 2025-2032. https://doi.org/10.1016/j.spl.2012.06.015
  6. Park, S. and Shin, M. (2013). Kullback-Leibler information of Type I censored variable and its application. To appear in Statistics.
  7. Rao, M., Chen, Y., Vemuri, B. C. and Wang, F. (2004). Cumulative residual entropy: A new measure of information. IEEE Transactions on Information Theory, 50, 1220-1228. https://doi.org/10.1109/TIT.2004.828057
  8. Soofi, E. S. (2000). Principal information theoretic approaches. Journal of the American Statistical Association, 95, 1349-1353. https://doi.org/10.1080/01621459.2000.10474346

Cited by

  1. An adjusted cumulative Kullback-Leibler information with application to test of exponentiality vol.49, pp.1, 2020, https://doi.org/10.1080/03610926.2018.1529243