• Title/Summary/Keyword: Hellinger distance

Reducing Bias of the Minimum Hellinger Distance Estimator of a Location Parameter

  • Pak, Ro-Jin
    • Journal of the Korean Data and Information Science Society / v.17 no.1 / pp.213-220 / 2006
  • Since Beran (1977) developed minimum Hellinger distance estimation, this method has been a popular topic in the field of robust estimation. In the process of defining the distance, a kernel density estimator has been widely used as the density estimator. In this article, however, we show that combining a kernel density estimator with an empirical density can yield a smaller bias of the minimum Hellinger distance estimator of a location parameter than using a kernel density estimator alone. A generic sketch of the estimator follows this entry.

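A minimal sketch of the generic Beran-style minimum Hellinger distance estimator for a normal location parameter, assuming a Gaussian kernel density estimate for the data density; the grid, bandwidth and function names are illustrative, and this is not the paper's kernel/empirical-density combination:

    # Generic MHDE for the location of N(theta, scale^2): minimize the
    # squared Hellinger distance between a KDE of the data and the model.
    import numpy as np
    from scipy.stats import norm, gaussian_kde
    from scipy.integrate import trapezoid
    from scipy.optimize import minimize_scalar

    def mhde_location(x, scale=1.0):
        kde = gaussian_kde(x)                    # nonparametric density estimate
        grid = np.linspace(x.min() - 3, x.max() + 3, 512)
        g_hat = kde(grid)

        def hellinger_sq(theta):
            # H^2(g, f_theta) = 1 - int sqrt(g * f_theta), approximated on the grid
            f = norm.pdf(grid, loc=theta, scale=scale)
            return 1.0 - trapezoid(np.sqrt(g_hat * f), grid)

        return minimize_scalar(hellinger_sq, bounds=(grid[0], grid[-1]),
                               method="bounded").x

    rng = np.random.default_rng(0)
    data = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(8.0, 1.0, 5)])
    print(mhde_location(data))                   # near 0 despite 5% contamination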

Minimum Hellinger Distance Based Goodness-of-fit Tests in Normal Models: Empirical Approach

  • Dong Bin Jeong
    • Communications for Statistical Applications and Methods / v.6 no.3 / pp.967-976 / 1999
  • In this paper we study Hellinger distance based goodness-of-fit tests that are analogs of likelihood ratio tests. The minimum Hellinger distance estimator (MHDE) in normal models provides an excellent robust alternative to the usual maximum likelihood estimator. Our simulation results show that the goodness-of-fit test based on the Hellinger deviance test (Simpson 1989) is robust when data contain outliers. The proposed Hellinger deviance test (Simpson 1989) is a more direct method for obtaining robust inferences than an automated outlier screening method applied before a likelihood ratio test analysis. A rough sketch of such a test statistic follows this entry.

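A rough sketch of a Hellinger-deviance-style statistic for testing H0: theta = theta0 in a normal location model, built by analogy with the likelihood ratio test; the exact calibration in Simpson (1989) is not reproduced here, so treat the constant and the chi-square comparison as indicative only:

    # Twice the sample size times the drop in squared Hellinger distance
    # when theta is freed, mimicking the likelihood ratio construction.
    import numpy as np
    from scipy.stats import norm, gaussian_kde
    from scipy.integrate import trapezoid
    from scipy.optimize import minimize_scalar

    def hellinger_deviance(x, theta0, scale=1.0):
        kde = gaussian_kde(x)
        grid = np.linspace(x.min() - 3, x.max() + 3, 512)
        g_hat = kde(grid)

        def h2(theta):
            f = norm.pdf(grid, loc=theta, scale=scale)
            return 1.0 - trapezoid(np.sqrt(g_hat * f), grid)

        theta_hat = minimize_scalar(h2, bounds=(grid[0], grid[-1]),
                                    method="bounded").x
        return 2 * len(x) * (h2(theta0) - h2(theta_hat))

    rng = np.random.default_rng(1)
    x = rng.normal(0.3, 1.0, 100)
    print(hellinger_deviance(x, theta0=0.0))     # compare informally to chi^2_1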

Penalizing the Negative Exponential Disparity in Discrete Models

  • Sarkar, Sahadeb; Song, Kijoung-Song; Jeong, Dong-Bin
    • Communications for Statistical Applications and Methods / v.5 no.2 / pp.517-529 / 1998
  • When the sample size is small, the robust minimum Hellinger distance (HD) estimator can have substantially poor relative efficiency at the true model. Similarly, approximating the exact null distributions of the ordinary Hellinger distance tests by the limiting chi-square distributions can be quite inappropriate in small samples. To overcome these problems, Harris and Basu (1994) and Basu et al. (1996) recommended using a modified HD called the penalized Hellinger distance (PHD). Lindsay (1994) and Basu et al. (1997) showed that another density-based distance, the negative exponential disparity (NED), is a major competitor to the Hellinger distance in producing an asymptotically fully efficient and robust estimator. In this paper we investigate the small-sample performance of the estimates and tests based on the NED and the penalized NED (PNED). Our results indicate that, in the settings considered here, the NED, unlike the HD, produces estimators that perform very well in small samples, and that penalizing the NED does not help. In testing of hypotheses, however, the deviance test based on the PNED appears to achieve the best small-sample level compared to tests based on the NED, HD and PHD. The disparities in question are sketched below.

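A sketch of the discrete disparities discussed above, computed between observed relative frequencies d(x) and model probabilities m(x); the factor 2 in the HD, the penalty weight h, and the NED standardization follow common conventions in this literature but are illustrative rather than the papers' definitive forms:

    import numpy as np

    def hellinger_disparity(d, m):
        # HD(d, m) = 2 * sum (sqrt(d) - sqrt(m))^2
        return 2.0 * np.sum((np.sqrt(d) - np.sqrt(m)) ** 2)

    def penalized_hellinger(d, m, h=1.0):
        # Ordinary HD gives each empty cell (d = 0) the contribution 2*m(x);
        # the penalized version replaces that weight by 2*h*m(x) (h = 1
        # recovers the ordinary HD).
        empty = d == 0
        return 2.0 * (np.sum((np.sqrt(d[~empty]) - np.sqrt(m[~empty])) ** 2)
                      + h * np.sum(m[empty]))

    def negative_exponential_disparity(d, m):
        # NED(d, m) = sum (exp(-delta) - 1 + delta) * m, with delta = d/m - 1;
        # the "+ delta" term sums to zero when d and m are both distributions.
        delta = d / m - 1.0
        return np.sum((np.exp(-delta) - 1.0 + delta) * m)

    counts = np.array([12, 7, 0, 1])        # small sample with an empty cell
    d = counts / counts.sum()               # observed relative frequencies
    m = np.array([0.4, 0.3, 0.2, 0.1])      # hypothesized model probabilities
    print(hellinger_disparity(d, m))
    print(penalized_hellinger(d, m, h=0.5))
    print(negative_exponential_disparity(d, m))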

Minimum Hellinger Distance Estimation and Minimum Density Power Divergence Estimation in Estimating Mixture Proportions

  • Pak, Ro-Jin
    • Journal of the Korean Data and Information Science Society / v.16 no.4 / pp.1159-1165 / 2005
  • Basu et al. (1998) proposed a new density-based estimator, called the minimum density power divergence estimator (MDPDE), which avoids the use of nonparametric density estimation and associated complications such as bandwidth selection. Woodward et al. (1995) examined the minimum Hellinger distance estimator (MHDE), proposed by Beran (1977), for estimating the mixture proportion in a mixture of two normals. In this article, we introduce the MDPDE for a mixture proportion, and show that both the MDPDE and the MHDE have the same asymptotic distribution at the model. A simulation study identifies some cases where the MHDE is consistently better than the MDPDE in terms of bias. A sketch of the MDPDE objective for a mixture proportion follows this entry.

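A sketch of the MDPDE objective of Basu et al. (1998) for a mixture proportion p in f_p = p N(0,1) + (1-p) N(4,1), with both components treated as known; the value of alpha, the component parameters and the grid integration are illustrative choices:

    # Empirical DPD objective: int f^(1+alpha) - (1 + 1/alpha) * mean(f(X_i)^alpha),
    # which needs no kernel density estimate of the data.
    import numpy as np
    from scipy.stats import norm
    from scipy.integrate import trapezoid
    from scipy.optimize import minimize_scalar

    def mdpde_mixture_proportion(x, alpha=0.5):
        grid = np.linspace(-6.0, 10.0, 1024)

        def objective(p):
            f = lambda t: p * norm.pdf(t) + (1 - p) * norm.pdf(t, loc=4.0)
            integral = trapezoid(f(grid) ** (1 + alpha), grid)
            empirical = np.mean(f(x) ** alpha)
            return integral - (1 + 1 / alpha) * empirical

        return minimize_scalar(objective, bounds=(0.0, 1.0), method="bounded").x

    rng = np.random.default_rng(2)
    x = np.concatenate([rng.normal(0.0, 1.0, 70), rng.normal(4.0, 1.0, 30)])
    print(mdpde_mixture_proportion(x))      # roughly 0.7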

Empirical Comparisons of Disparity Measures for Partial Association Models in Three Dimensional Contingency Tables

  • Jeong, D.B.; Hong, C.S.; Yoon, S.H.
    • Communications for Statistical Applications and Methods / v.10 no.1 / pp.135-144 / 2003
  • This work is concerned with comparing recently developed disparity measures for the partial association model in three-dimensional categorical data. Data are generated by simulation on each term of the log-linear model equation based on the partial association model, a method proposed in this paper. These alternative Monte Carlo methods are explored to study the behavior, for moderate sample sizes, of disparity measures such as the power divergence statistic $I(\lambda)$, the Pearson chi-square statistic $X^2$, the likelihood ratio statistic $G^2$, the blended weight chi-square statistic $BWCS(\lambda)$, the blended weight Hellinger distance statistic $BWHD(\lambda)$, and the negative exponential disparity statistic $NED(\lambda)$. We find that the power divergence statistic $I(2/3)$ and the blended weight Hellinger distance family $BWHD(1/9)$ are the best tests with respect to size and power. The power divergence family is sketched below.
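
A sketch of the Cressie-Read power divergence statistic $I(\lambda)$ referred to above, comparing observed counts O with expected counts E; $\lambda = 1$ gives Pearson's $X^2$ and $\lambda \to 0$ gives $G^2$, and the counts below are made up for illustration:

    import numpy as np

    def power_divergence(O, E, lam):
        O, E = np.asarray(O, float), np.asarray(E, float)
        if abs(lam) < 1e-12:                 # lambda -> 0 limit: G^2
            return 2.0 * np.sum(np.where(O > 0, O * np.log(O / E), 0.0))
        return (2.0 / (lam * (lam + 1.0))) * np.sum(O * ((O / E) ** lam - 1.0))

    O = [18, 22, 9, 11]                      # observed cell counts (illustrative)
    E = [15, 25, 12, 8]                      # expected counts under a fitted model
    print(power_divergence(O, E, 2 / 3))     # the I(2/3) statistic favored above
    print(power_divergence(O, E, 1.0))       # Pearson X^2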

Robustness of Minimum Disparity Estimators in Linear Regression Models

  • Pak, Ro-Jin
    • Journal of the Korean Statistical Society / v.24 no.2 / pp.349-360 / 1995
  • This paper deals with the robustness properties of minimum disparity estimation in linear regression models. The estimators are defined as the statistical quantities that minimize the blended weight Hellinger distance between a weighted kernel density estimator of the residuals and a smoothed model density of the residuals. It is shown that if the weights of the density estimator are appropriately chosen, the estimates of the regression parameters are robust. A simplified sketch of the approach follows this entry.

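A simplified sketch of minimum-disparity regression: choose beta so that the kernel density estimate of the residuals y - X beta is close, in Hellinger distance, to a N(0, sigma^2) model density. A plain (unweighted) KDE, a fixed sigma and a Nelder-Mead search are illustrative simplifications; the paper's weighted kernel estimator and blended weight Hellinger distance are not reproduced:

    import numpy as np
    from scipy.stats import norm, gaussian_kde
    from scipy.integrate import trapezoid
    from scipy.optimize import minimize

    def min_hellinger_regression(X, y, sigma=1.0):
        def h2(beta):
            r = y - X @ beta                 # residuals at this beta
            grid = np.linspace(r.min() - 3, r.max() + 3, 512)
            g = gaussian_kde(r)(grid)        # density estimate of the residuals
            f = norm.pdf(grid, scale=sigma)  # model density of the residuals
            return 1.0 - trapezoid(np.sqrt(g * f), grid)

        beta0 = np.linalg.lstsq(X, y, rcond=None)[0]   # OLS starting point
        return minimize(h2, beta0, method="Nelder-Mead").x

    rng = np.random.default_rng(3)
    X = np.column_stack([np.ones(100), rng.uniform(0.0, 5.0, 100)])
    y = X @ np.array([1.0, 2.0]) + rng.normal(0.0, 1.0, 100)
    y[:5] += 15.0                            # gross outliers in the response
    print(min_hellinger_regression(X, y))    # near [1, 2] despite the outliers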

SOME INEQUALITIES FOR THE CSISZÁR $\Phi$-DIVERGENCE

  • Dragomir, S.S.
    • Journal of the Korean Society for Industrial and Applied Mathematics / v.7 no.1 / pp.63-77 / 2003
  • Some inequalities for the Csiszár $\Phi$-divergence, with applications to the Kullback-Leibler, Rényi, Hellinger and Bhattacharyya distances in information theory, are given. The divergence and several of its special cases are sketched below.

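A sketch of the Csiszár $\Phi$-divergence for discrete distributions, $D_\Phi(p, q) = \sum_x q(x)\, \Phi(p(x)/q(x))$, together with generator functions for three classical special cases; the normalizations follow one common textbook convention:

    import numpy as np

    def csiszar(p, q, phi):
        p, q = np.asarray(p, float), np.asarray(q, float)
        return np.sum(q * phi(p / q))

    kl        = lambda t: t * np.log(t)                 # Kullback-Leibler
    hellinger = lambda t: 0.5 * (np.sqrt(t) - 1) ** 2   # squared Hellinger distance
    chi2      = lambda t: (t - 1) ** 2                  # Pearson chi-square

    p = [0.5, 0.3, 0.2]
    q = [0.4, 0.4, 0.2]
    for name, phi in [("KL", kl), ("Hellinger^2", hellinger), ("chi^2", chi2)]:
        print(name, csiszar(p, q, phi))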

The Estimating Equations Induced from the Minimum Distance Estimation

  • Pak, Ro-Jin
    • Journal of the Korean Data and Information Science Society / v.14 no.3 / pp.687-696 / 2003
  • This article presents a new family of estimating functions related to minimum distance estimation, and discusses its relationship to the family of minimum density power divergence estimating equations. Two representative minimum distance estimations, the minimum $L_2$ distance estimation and the minimum Hellinger distance estimation, are studied in the light of the theory of estimating equations. Despite the desirable properties of minimum distance estimations, they are not widely used by general researchers, because the related theory is complex and hard to implement computationally in real problems. Hopefully, this article will help readers understand minimum distance estimation better. A concrete example follows this entry.

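As one concrete member of the family discussed above, the minimum $L_2$ distance estimator of a normal location has a closed-form objective: expanding $\int (f_\theta - g)^2$ and dropping the $g$-only term leaves $\int f_\theta^2 - (2/n)\sum f_\theta(X_i)$, and setting its derivative to zero gives a redescending estimating equation in $\theta$. A sketch under the illustrative assumption of known scale $\sigma = 1$:

    import numpy as np
    from scipy.stats import norm
    from scipy.optimize import minimize_scalar

    def min_l2_location(x, sigma=1.0):
        # int N(.; theta, sigma)^2 dx = 1 / (2 * sigma * sqrt(pi)), free of theta
        const = 1.0 / (2.0 * sigma * np.sqrt(np.pi))
        obj = lambda theta: const - 2.0 * np.mean(norm.pdf(x, loc=theta, scale=sigma))
        return minimize_scalar(obj, bounds=(x.min(), x.max()), method="bounded").x

    rng = np.random.default_rng(4)
    x = np.concatenate([rng.normal(0.0, 1.0, 90), rng.normal(10.0, 1.0, 10)])
    print(min_l2_location(x))                # near 0, largely ignoring the outliers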

The Minimum Squared Distance Estimator and the Minimum Density Power Divergence Estimator

  • Pak, Ro-Jin
    • Communications for Statistical Applications and Methods / v.16 no.6 / pp.989-995 / 2009
  • Basu et al. (1998) proposed the minimum divergence estimating method, which is free from the use of a painful kernel density estimator. Their proposed class of density power divergences is indexed by a single parameter $\alpha$ which controls the trade-off between robustness and efficiency. In this article, (1) we introduce a new large class, the minimum squared distances, which ranges from the minimum Hellinger distance to the minimum $L_2$ distance, and show that under certain conditions both the minimum density power divergence estimator (MDPDE) and the minimum squared distance estimator (MSDE) are asymptotically equivalent; and (2) we show that in finite samples the MDPDE generally performs better than the MSDE, although there are some cases where the MSDE performs better when estimating a location parameter or a proportion of mixed distributions. The divergence family is displayed below.
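
For reference, the density power divergence family of Basu et al. (1998) against which the minimum squared distance class is compared is

$$ d_\alpha(g, f) = \int \Big\{ f^{1+\alpha}(x) - \Big(1 + \tfrac{1}{\alpha}\Big) g(x) f^{\alpha}(x) + \tfrac{1}{\alpha}\, g^{1+\alpha}(x) \Big\}\, dx, \qquad \alpha > 0, $$

which approaches the Kullback-Leibler divergence (maximum likelihood) as $\alpha \to 0$ and reduces to the squared $L_2$ distance $\int (g - f)^2\, dx$ at $\alpha = 1$, making explicit the robustness-efficiency trade-off controlled by $\alpha$.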

Negative Exponential Disparity Based Robust Estimates of Ordered Means in Normal Models

  • Bhattacharya, Bhaskar; Sarkar, Sahadeb; Jeong, Dong-Bin
    • Communications for Statistical Applications and Methods / v.7 no.2 / pp.371-383 / 2000
  • Lindsay (1994) and Basu et al. (1997) show that another density-based distance, the negative exponential disparity (NED), is an excellent competitor to the Hellinger distance (HD) in generating an asymptotically fully efficient and robust estimator. Bhattacharya and Basu (1996) consider estimation of the locations of several normal populations when an order relation between them is known to be true. They empirically show that the robust HD based weighted likelihood estimators compare favorably with the M-estimators based on Huber's $\psi$ function, the Gastwirth estimator, and the trimmed mean estimator. In this paper we investigate the performance of the weighted likelihood estimator based on the NED as a robust alternative to that based on the HD. The NED based estimator is found to be quite competitive in the settings considered by Bhattacharya and Basu. An illustrative sketch of the ordered-means setting follows this entry.

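An illustrative sketch of the ordered-means setting above: robust per-group location estimates followed by enforcement of a known ordering $\mu_1 \le \mu_2 \le \mu_3$ via pool-adjacent-violators. The Huber M-estimate used here is a stand-in for, not a reproduction of, the NED-based weighted likelihood estimator:

    import numpy as np

    def huber_location(x, c=1.345, tol=1e-8):
        # Iterative Huber M-estimate of location with fixed MAD scale
        mu = np.median(x)
        s = np.median(np.abs(x - mu)) / 0.6745
        for _ in range(100):
            r = np.clip((x - mu) / s, -c, c)         # Huber psi of the residuals
            mu_new = mu + s * r.mean()
            if abs(mu_new - mu) < tol:
                break
            mu = mu_new
        return mu

    def pava(values, weights):
        # Pool-adjacent-violators: returns nondecreasing pooled block means
        v, w = list(values), list(weights)
        i = 0
        while i < len(v) - 1:
            if v[i] > v[i + 1]:                      # order violation: pool
                v[i] = (w[i] * v[i] + w[i + 1] * v[i + 1]) / (w[i] + w[i + 1])
                w[i] += w[i + 1]
                del v[i + 1], w[i + 1]
                i = max(i - 1, 0)
            else:
                i += 1
        return v

    rng = np.random.default_rng(5)
    groups = [rng.normal(m, 1.0, 30) for m in (0.0, 0.5, 1.5)]
    groups[0][:3] += 12.0                            # contaminate the first group
    estimates = [huber_location(g) for g in groups]
    print(pava(estimates, [len(g) for g in groups])) # ordered block estimates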