 Title & Authors
Pruning the Boosting Ensemble of Decision Trees
Yoon, Young-Joo; Song, Moon-Sup;
 Abstract
We propose variable selection methods based on penalized regression for pruning decision tree ensembles. Pruning methods based on the LASSO and SCAD penalties are compared with the cluster pruning method. Comparative studies are performed on artificial and real datasets. The results show that the proposed penalized-regression methods reduce the size of boosting ensembles without significantly decreasing accuracy, and that they perform better than the cluster pruning method. In the presence of classification noise, the proposed pruning methods also mitigate the noise sensitivity of AdaBoost to some degree.
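The LASSO-based idea can be illustrated with a short sketch (not the authors' implementation; it assumes scikit-learn and uses an L1-penalized logistic regression in place of the paper's penalized regression step, while SCAD would require a dedicated package): fit an AdaBoost ensemble of trees, treat each tree's prediction as one column of a design matrix, and keep only the trees whose coefficients are not shrunk to zero.

# Hypothetical sketch of LASSO-style ensemble pruning; not the authors' code.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# Step 1: fit a boosting ensemble of decision trees (stumps here).
ada = AdaBoostClassifier(DecisionTreeClassifier(max_depth=1),
                         n_estimators=100, random_state=0).fit(X, y)

# Step 2: each base tree's prediction, recoded to +/-1, becomes one column
# of a design matrix H.
H = np.column_stack([2 * t.predict(X) - 1 for t in ada.estimators_])

# Step 3: L1-penalized (LASSO-type) logistic regression on these columns;
# trees whose coefficients are shrunk to zero are pruned from the ensemble.
lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.05).fit(H, y)
kept = np.flatnonzero(lasso.coef_.ravel())
print(f"kept {kept.size} of {len(ada.estimators_)} trees")

# Pruned-ensemble prediction: sign of the selected trees' weighted vote.
def predict_pruned(X_new):
    H_new = np.column_stack([2 * t.predict(X_new) - 1 for t in ada.estimators_])
    score = H_new[:, kept] @ lasso.coef_.ravel()[kept] + lasso.intercept_[0]
    return (score > 0).astype(int)

The penalty strength C here is arbitrary and only illustrative; a smaller C prunes more trees, and in practice the penalty would be tuned rather than fixed.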
 Keywords
AdaBoost; Penalized regression; Cluster pruning; LASSO; SCAD; Pruning ensemble
 Language
English
 References
1.
Breiman, L. (1996). Bagging predictors. Machine Learning, Vol. 24, 123-140

2.
Breiman, L. (1998). Arcing classifiers (with discussion). Annals of Statistics, Vol. 26, 801-849

3.
Breiman, L., Friedman, J., Olshen, R. and Stone, C. (1984). Classification and Regression Trees, Chapman and Hall, New York

4.
Dietterich, T.G. (2000). An experimental comparison of three methods for constructing ensembles of decision trees: bagging, boosting, and randomization. Machine Learning, Vol. 40, 139-157

5.
Fan, J. and Li, R. (2001). Variable selection via nonconcave penalized likelihood and its oracle properties. Journal of the American Statistical Association, Vol. 96, 1348-1360

6.
Freund, Y. and Schapire, R. E. (1997). A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences, Vol. 55, 119-139

7.
Friedman, J. (2001). Greedy function approximation: a gradient boosting machine. Annals of Statistics, Vol. 29, 1189-1232

8.
Hastie, T., Tibshirani, R. and Friedman, J.H. (2001). The Elements of Statistical Learning. Springer-Verlag, New York

9.
Heskes, T. (1997). Balancing between bagging and bumping. In Mozer, M., Jordan, M. and Petsche, T. (editors), Advances in Neural Information Processing Systems, Morgan Kaufmann

10.
Lazarevic, A. and Obradovic, Z. (2001). The effective pruning of neural network ensembles. Proceedings of the 2001 IEEE/INNS International Joint Conference on Neural Networks, 796-801

11.
Margineantu, D.D. and Dietterich, T.G. (1997). Pruning adaptive boosting. Proceedings of the 14th International Conference on Machine Learning, 211-218

12.
Mason, L., Baxter, J., Bartlett, P.L. and Frean, M. (2000). Functional gradient techniques for combining hypotheses. In Smola, A.J., Bartlett, P.L., Scholkopf, B. and Schuurmans, D. (editors), Advances in Large Margin Classifiers, MIT Press, Cambridge

13.
Merz, C.J. and Murphy, P.M. (1998). UCI Repository of Machine Learning Databases. Available at http://www.ics.uci.edu/~mlearn/MLRepository.html

14.
Quinlan, J.R. (1993). C4.5: Programs for Machine Learning, Morgan Kaufmann, San Mateo, CA

15.
Quinlan, J.R. (1996). Bagging, boosting, and C4.5. Proceedings of the 13th National Conference on Artificial Intelligence, 725-730

16.
Rosset, S., Zhu, J. and Hastie, T. (2004). Boosting as a regularized path to a maximum margin classifier. Journal of Machine Learning Research, Vol. 5, 941-973

17.
Tamon, C. and Xiang, J. (2000). On the boosting pruning problem. Proceedings of the 11th European Conference on Machine Learning, Lecture Notes in Computer Science, Vol. 1810, 404-412

18.
Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society, Series B, Vol. 58, 267-288

19.
Tibshirani, R. and Knight, K. (1999). Model selection and inference by bootstrap 'bumping'. Journal of Computational and Graphical Statistics, Vol. 8, 671-686