The Bandwidth from the Density Power Divergence
Pak, Ro Jin
The most widely used optimal bandwidth is known to minimize the mean integrated squared error (MISE) of a kernel density estimator from a true density. In this article, we propose a bandwidth which asymptotically minimizes the mean integrated density power divergence (MIDPD) between a true density and the corresponding kernel density estimator. An approximated form of the mean integrated density power divergence is derived, and a bandwidth is obtained by minimizing this approximated form. The resulting bandwidth resembles the optimal bandwidth of Parzen (1962), but it reflects the nature of a model density more than the existing optimal bandwidths do. We thus gain one more choice of an optimal bandwidth with a firm theoretical background; in addition, through an empirical study we show that the bandwidth from the mean integrated density power divergence can produce a density estimator that fits a sample better than the bandwidth from the mean integrated squared error.
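To make the role of the bandwidth concrete, the sketch below builds a Gaussian kernel density estimator by hand and plugs in Silverman's normal-reference rule of thumb, which is one standard MISE-based choice; the paper's MIDPD bandwidth itself requires the derivation in the article and is not reproduced here. The helper names `kde` and `silverman_bandwidth` are hypothetical, introduced only for illustration.

```python
import math
import random

def gaussian_kernel(u):
    # Standard normal density, used as the smoothing kernel.
    return math.exp(-0.5 * u * u) / math.sqrt(2.0 * math.pi)

def kde(x, sample, h):
    """Kernel density estimate at point x with bandwidth h."""
    n = len(sample)
    return sum(gaussian_kernel((x - xi) / h) for xi in sample) / (n * h)

def silverman_bandwidth(sample):
    """Normal-reference rule of thumb h = 1.06 * s * n^(-1/5),
    an MISE-motivated bandwidth (a baseline, not the MIDPD bandwidth)."""
    n = len(sample)
    mean = sum(sample) / n
    s = math.sqrt(sum((xi - mean) ** 2 for xi in sample) / (n - 1))
    return 1.06 * s * n ** (-0.2)

random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(500)]
h = silverman_bandwidth(data)
density_at_zero = kde(0.0, data, h)
print(h, density_at_zero)
```

For a standard normal sample, the estimate at 0 should land near the true density value 1/sqrt(2*pi) ≈ 0.399; a larger h oversmooths the estimate and a smaller h makes it ragged, which is exactly the trade-off that an optimal-bandwidth criterion such as MISE or MIDPD resolves.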
Density estimator; density power divergence; Kullback-Leibler divergence; distance; mean integrated squared error
Basu, A., Harris, I. R., Hjort, N. L. and Jones, M.C. (1998). Robust and efficient estimation by minimizing a density power divergence, Biometrika, 85, 549-559.
Basu, A., Mandal, A., Martin, N. and Pardo, L. (2013). Testing statistical hypotheses based on the density power divergence, Annals of the Institute of Statistical Mathematics, 65, 319-348.
Devroye, L. and Györfi, L. (1985). Nonparametric Density Estimation: The L1 View, Wiley, New York.
Durio, A. and Isaia, E. D. (2011). The minimum density power divergence approach in building robust regression models, Informatica, 22, 43-56.
Fujisawa, H. and Eguchi, S. (2006). Robust estimation in the normal mixture model, Journal of Statistical Planning and Inference, 136, 3989-4011.
Hall, P. (1987). On Kullback-Leibler loss and density estimation, The Annals of Statistics, 15, 1491-1519.
Kanazawa, Y. (1993). Hellinger distance and Kullback-Leibler loss for the kernel density estimator, Statistics and Probability Letters, 18, 315-321.
Kincaid, D. and Cheney, W. (1991). Numerical Analysis: Mathematics of Scientific Computing, Brooks/Cole, New York.
Lee, S. and Na, O. (2005). Test for parameter change based on the estimator minimizing density-based divergence measures, Annals of the Institute of Statistical Mathematics, 57, 553-573.
Marron, J. and Wand, M. (1992). Exact mean integrated squared error, The Annals of Statistics, 20, 712-736.
Parzen, E. (1962). On estimation of a probability density function and mode, The Annals of Mathematical Statistics, 33, 1065-1076.
Rosenblatt, M. (1956). Remarks on some nonparametric estimates of a density function, The Annals of Mathematical Statistics, 27, 832-837.
Silverman, B. W. (1986). Density Estimation for Statistics and Data Analysis, Chapman and Hall/CRC, New York.
Warwick, J. and Jones, M. C. (2005). Choosing a robustness tuning parameter, Journal of Statistical Computation and Simulation, 75, 581-588.