 Title & Authors
Object Classification Method Using Dynamic Random Forests and Genetic Optimization
Kim, Jae Hyup; Kim, Hun Ki; Jang, Kyung Hyun; Lee, Jong Min; Moon, Young Shik;
 Abstract
In this paper, we propose an object classification method based on a dynamic random forest composed of an optimal combination of unit trees selected by a genetic algorithm. A random forest is an ensemble classifier that combines bagging-based unit decision trees; by injecting randomness into the training samples and the feature subsets assigned to each tree, it achieves good generalization performance when a large number of trees are combined. However, because the unit trees are generated randomly, a random forest delivers high classification performance only when a sufficient number of trees are combined, and since there is no quantitative method for determining that number, tree structures must be generated repeatedly at random. The proposed algorithm instead composes the forest from an optimal combination of trees while preserving the generalization performance of the random forest. To this end, the problem of improving classification performance is cast as an optimization problem of finding the optimal tree combination, which is solved with a genetic algorithm. Experiments show that the proposed algorithm improves classification performance by about 3~5% over the existing random forest on common benchmark databases and on our own infrared database, and that the optimal tree combination is reached at 55~60% of the maximum number of trees.
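The selection idea described above can be sketched in code. The following is a minimal illustration, not the authors' implementation: each tree is encoded as one bit of a binary chromosome (1 = keep), fitness is the validation accuracy of the majority vote over the selected trees, and a standard genetic algorithm (tournament selection, single-point crossover, bit-flip mutation, elitism) searches for the best subset. The function name `ga_select_trees`, the toy majority-vote classifier, and all GA parameters are assumptions chosen for the sketch; the paper's actual chromosome encoding and fitness function may differ.

```python
import random

def majority_vote(trees, X):
    # Predict each sample by majority vote over the given trees (labels 0/1).
    preds = []
    for x in X:
        votes = [t(x) for t in trees]
        preds.append(1 if sum(votes) * 2 >= len(votes) else 0)
    return preds

def accuracy(trees, X, y):
    # Validation accuracy of the ensemble; an empty subset scores 0.
    if not trees:
        return 0.0
    preds = majority_vote(trees, X)
    return sum(p == t for p, t in zip(preds, y)) / len(y)

def ga_select_trees(forest, X_val, y_val, pop_size=20, generations=30,
                    crossover_rate=0.9, mutation_rate=0.05, seed=0):
    """Search for a subset of trees whose majority vote maximizes
    validation accuracy. Chromosome: one bit per tree (1 = keep)."""
    rng = random.Random(seed)
    n = len(forest)

    def fitness(chrom):
        subset = [t for t, bit in zip(forest, chrom) if bit]
        return accuracy(subset, X_val, y_val)

    # Initial population of random bit strings.
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        next_pop = scored[:2]                        # elitism: keep the 2 best
        while len(next_pop) < pop_size:
            # Tournament selection of two parents.
            p1 = max(rng.sample(pop, 3), key=fitness)
            p2 = max(rng.sample(pop, 3), key=fitness)
            if rng.random() < crossover_rate:        # single-point crossover
                cut = rng.randrange(1, n)
                child = p1[:cut] + p2[cut:]
            else:
                child = p1[:]
            # Bit-flip mutation.
            child = [b ^ 1 if rng.random() < mutation_rate else b
                     for b in child]
            next_pop.append(child)
        pop = next_pop
    best = max(pop, key=fitness)
    return best, fitness(best)
```

In this sketch the number of selected trees is not fixed in advance: the GA is free to switch off trees that hurt the vote, which mirrors the paper's finding that a subset of the full forest can outperform the whole ensemble.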
 Keywords
Object Classification;Random Forest;Genetic Algorithm;Classifier Ensemble;
 Language
Korean
 References
1.
D. E. Rumelhart and J. L. McClelland, "Parallel distributed processing: explorations in the microstructure of cognition," MIT Press, 1986.

2.
C. J. C. Burges, "A tutorial on support vector machines for pattern recognition," Journal of Data Mining and Knowledge Discovery, Vol.2, pp.121-167, 1998.

3.
H. Y. Yeom, J. H. Kim, and Y. S. Moon, "Gene Classification Method using Neural Networks and Membership Function," Journal of IEEK, Vol. 42CI, No. 4, pp.33-42, 2005.

4.
S. K. Kang, Y. U. Kim, I. M. So, and S. T. Jung, "Enhancement of the Correctness of Marker Detection and Marker Recognition based on Artificial Neural Networks," Journal of KSCI, Vol. 13, No. 1, pp. 89-97, Jan. 2008.

5.
Kwang Seong Kim and Doosung Hwang, "Support Vector Machine Algorithm for Imbalanced Data Learning," Journal of KSCI, Vol. 15, No. 7, pp. 11-17, July 2010.

6.
J. H. Kim, T. W. Cho, S. W. Chun, J. M. Lee, and Y. S. Moon, "Gunnery Classification Method Using Profile Feature Extraction in Infrared Images," Journal of KSCI, Vol. 19, No. 10, October 2014.

7.
R. Polikar, "Ensemble based systems in decision making," IEEE Circuits and Systems Magazine, Vol.6, No.3, pp.21-45, 2006.

8.
P. Viola and M. Jones, "Robust real-time face detection," Int. Journal of Computer Vision, Vol.57, No.2, pp.137-154, 2004.

9.
L. Breiman, "Random forest," Machine Learning, Vol.45, pp.5-32, 2001. crossref(new window)

10.
J. H. Kim, K. H. Jang, J. H. Lee, and Y. S. Moon, "Multi-target Classification Method Based on Adaboost and Radial Basis Function," Journal of IEEK, Vol. 47 CI, No. 3, pp. 22-28, May 2010.

11.
K. Jung, J. Choi, and K. Jang, "Facial express recognition using registration and Adaboost," Journal of IEEK, Vol. 51, No. 11, pp.193-201, 2014.

12.
L. Breiman, J. H. Friedman, R. A. Olshen, and C. J. Stone, "Classification and Regression Trees," Chapman and Hall, 1993.

13.
R. Banfield, L. Hall, K. Bowyer, D. Bhadoria, W. P. Kegelmeyer, and S. Eschrich, "A comparison of ensemble creation techniques," Proc. of Multiple Classifier Systems, Vol.1, pp.223-232, 2004.

14.
S. Bernard, L. Heutte, and S. Adam, "Influence of hyperparameters on random forest accuracy," Proc. of Workshop on Multiple Classifier Systems, Vol.1, pp.171-180, 2009.

15.
S. Bernard, L. Heutte, and S. Adam, "On the selection of decision trees in random forest," Proc. of Joint Conf. on Neural Networks, pp.302-307, 2009.

16.
E. Tripoli, D. Fotiadis, and G. Manis, "Dynamic construction of random forests: Evaluation using biomedical engineering problems," Proc. of IEEE Int. Conference on Information Technology and Application in Biomedicine, Vol.1, pp.3-5, 2010.

17.
S. Bernard, S. Adam, and L. Heutte, "Dynamic random forests," Journal of Pattern Recognition Letters, Vol.33, No.12, pp.1580-1586, 2012.

18.
F. Roli, G. Giacinto, and G. Vernazza, "Methods for designing multiple classifier systems," Proc. of 2nd International Workshop MCS2001, pp.78-87, 2001.

19.
I. S. Oh, J. S. Lee, and B. R. Moon, "Hybrid genetic algorithms for feature selection," IEEE Trans. on Pattern Analysis and Machine Intelligence, Vol.26, No.11, pp.1424-1437, 2004.

20.
K. S. Hu and I. S. Oh, "Genetic Algorithm for Node Pruning of Neural Networks," Journal of IEEK, Vol.46CI, No.2, pp.65-74, 2009.

21.
http://archive.ics.uci.edu/ml/

22.
C. M. Kim, Y. M. Baek, and H. Y. Kim, "An Efficient Pedestrian Recognition Method based on PCA Reconstruction and HOG Feature Descriptor," Journal of IEIE, Vol.50, No.10, pp.162-170, 2013.

23.
R. Maclin, "Boosting Classifiers Regionally," In Proc. of the 15th National Conference on Artificial Intelligence, Vol.1, pp.700-705, 1998.

24.
Venkatadri M. and Srinivasa R. K., "A multiobjective genetic algorithm for feature selection in data mining," Journal of Computer Science and Information Technology, Vol.1, No.5, pp.443-448, 2010.

25.
T. G. Dietterich, "An experimental comparison of three methods for constructing ensembles of decision trees: bagging, boosting, and randomization," Journal of Machine Learning, Vol. 40, No.2, pp.139-157, 2000.