Algorithm for the L_{1}-Regression Estimation with High Breakdown Point

- Journal title : Communications for Statistical Applications and Methods
- Volume 17, Issue 4, 2010, pp.541-550
- Publisher : The Korean Statistical Society
- DOI : 10.5351/CKSS.2010.17.4.541

Title & Authors

Algorithm for the L_{1}-Regression Estimation with High Breakdown Point

Kim, Bu-Yong

Abstract

The $L_1$-regression estimator is highly robust to vertical outliers but susceptible to leverage points. This article is concerned with improving the robustness of the $L_1$-estimator in terms of the breakdown point. To improve its robustness, we dampen the influence of the leverage points by reducing their corresponding weights. In addition, the algorithm employs a linear scaling transformation technique to solve the linear programming problem of $L_1$-estimation with higher computational efficiency on large data sets. Monte Carlo simulation results indicate that the proposed algorithm yields $L_1$-estimates that are robust to leverage points as well as vertical outliers.
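The two ingredients the abstract names, downweighting leverage points and solving weighted $L_1$-estimation as a linear program, can be sketched as follows. This is a minimal illustration, not the paper's algorithm: it substitutes classical Mahalanobis distances and a cutoff of 2.5 (both hypothetical choices) for the paper's weighting scheme, and it does not reproduce the linear scaling transformation; the full method is in the Korean text.

```python
import numpy as np
from scipy.optimize import linprog

def weighted_l1_fit(X, y, w):
    """Solve min_beta sum_i w_i |y_i - x_i' beta| as a linear program.

    Each residual is split as r_i = u_i - v_i with u_i, v_i >= 0, so the
    objective becomes sum_i w_i (u_i + v_i) subject to X beta + u - v = y.
    """
    n, p = X.shape
    c = np.concatenate([np.zeros(p), w, w])        # cost on (beta, u, v)
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])   # X beta + u - v = y
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:p]

def leverage_weights(X, cutoff=2.5):
    """Downweight high-leverage rows via classical Mahalanobis distances.

    A stand-in for a robust distance (e.g. MCD-based); the cutoff 2.5 is
    an illustrative choice, not the paper's.
    """
    Z = X[:, 1:] if np.allclose(X[:, 0], 1) else X  # drop intercept column
    center = Z.mean(axis=0)
    cov = np.atleast_2d(np.cov(Z, rowvar=False))
    diff = Z - center
    d = np.sqrt(np.sum(diff @ np.linalg.inv(cov) * diff, axis=1))
    return np.where(d > cutoff, cutoff / d, 1.0)   # weight in (0, 1]
```

A typical call would be `weighted_l1_fit(X, y, leverage_weights(X))`: rows far from the bulk of the design points contribute less to the weighted $L_1$ objective, which is what raises the breakdown point relative to the unweighted $L_1$-estimator.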

Keywords

$L_1$-estimation; vertical outlier; leverage point; robustness; breakdown point

Language

Korean

References

1. Armstrong, R. D., Frome, E. L. and Kung, D. S. (1979). A revised simplex algorithm for the absolute deviation curve fitting problem, Communications in Statistics - Simulation and Computation, 8, 175-190.

2. Barrodale, I. and Roberts, F. D. K. (1973). An improved algorithm for discrete $L_1$ linear approximation, SIAM Journal on Numerical Analysis, 10, 839-848.

3. Bassett, G. and Koenker, R. (1978). Asymptotic theory of least absolute error regression, Journal of the American Statistical Association, 73, 618-622.

4. Blattberg, R. and Sargent, T. (1971). Regression with non-Gaussian stable disturbances: Some sampling results, Econometrica, 39, 501-510.

5. Bloomfield, P. and Steiger, W. (1980). Least absolute deviations curve-fitting, SIAM Journal on Scientific and Statistical Computing, 1, 290-301.

6. Chen, X. R. and Wu, Y. (1993). On a necessary condition for the consistency of the $L_1$-estimates in linear regression models, Communications in Statistics - Theory and Methods, 22, 631-639.

7. Coleman, T. F. and Li, Y. (1992). A globally and quadratically convergent affine scaling method for linear $L_1$ problems, Mathematical Programming, 56, 189-222.

8. Dielman, T. E. (2005). Least absolute value regression: Recent contributions, Journal of Statistical Computation and Simulation, 75, 263-286.

9. Dielman, T. E. and Pfaffenberger, R. (1982). LAV estimation in linear regression: A review, TIMS Studies in the Management Sciences, 19, 31-52.

10. Dielman, T. E. and Pfaffenberger, R. (1992). A further comparison of tests of hypothesis in LAV regression, Computational Statistics & Data Analysis, 14, 375-384.

11. Gentle, J. E., Narula, S. C. and Sposito, V. A. (1987). Algorithms for unconstrained $L_1$ linear regression, in Statistical Data Analysis Based on the $L_1$-Norm and Related Methods, edited by Y. Dodge, North-Holland, 83-94.

12. Hadi, A. S. (1994). A modification of a method for the detection of outliers in multivariate samples, Journal of the Royal Statistical Society, Series B, 56, 393-396.

13. Hardin, J. and Rocke, D. M. (2004). Outlier detection in the multiple cluster setting using the minimum covariance determinant estimator, Computational Statistics & Data Analysis, 44, 625-638.

14. Kim, B. Y. (1995). On the robustness of $L_1$-estimator in linear regression models, The Korean Communications in Statistics, 2, 277-287.

15. Kim, B. Y. (2004). Resampling-based hypothesis test in $L_1$-regression, The Korean Communications in Statistics, 11, 643-655.

16. Koenker, R. (1987). A comparison of asymptotic testing methods for $L_1$-regression, in Statistical Data Analysis Based on the $L_1$-Norm and Related Methods, edited by Y. Dodge, North-Holland, 287-298.

17. Montgomery, D. C., Peck, E. A. and Vining, G. G. (2006). Introduction to Linear Regression Analysis, John Wiley & Sons, New Jersey.

18. Pfaffenberger, R. C. and Dinkel, J. J. (1978). Absolute deviations curve fitting: An alternative to least squares, in Contributions to Survey Sampling and Applied Statistics, edited by H. A. David, Academic Press, New York, 279-294.

19. Rosenberg, B. and Carson, D. (1977). A simple approximation of the sampling distribution of least absolute residuals regression estimates, Communications in Statistics - Simulation and Computation, 6, 421-437.

20. Rousseeuw, P. J. (1985). Multivariate estimation with high breakdown point, in Mathematical Statistics and Applications, Vol. B, edited by W. Grossmann, G. Pflug, I. Vincze and W. Wertz, Reidel, Dordrecht.

21. Rousseeuw, P. J. and Van Driessen, K. (1999). A fast algorithm for the minimum covariance determinant estimator, Technometrics, 41, 212-223.