A New Ensemble System using Dynamic Weighting Method

  • Seo, Dong-Hun (Graduate School, Chungnam National University)
  • Lee, Won-Don (School of Electrical, Information and Communications Engineering, Chungnam National University)
  • Received : 2011.04.27
  • Accepted : 2011.05.31
  • Published : 2011.06.30

Abstract

In this paper, a new ensemble system is proposed that embeds weight information into the classifiers so that the weights can be determined dynamically. In traditional ensemble systems, the weights are fixed once the training phase ends: after they are extracted, they remain the same regardless of the test data. One way to circumvent this problem in gating networks is to update the weights dynamically by adding processes that build an architectural hierarchy, but this has the drawback of the added processes. A simple method is proposed that updates the weights dynamically without any additional process and that can be applied to an already established ensemble system without much architectural modification. Experiments show that the proposed method performs better than AdaBoost.

In this paper, we propose an ensemble system in which weight information is embedded into the classifiers so that the weights can be determined dynamically. Ensemble systems have so far used weights determined after the training phase; once determined, these weights are used statically regardless of the test data. One way to solve this problem is to add a process to the gating network that introduces an architectural hierarchy, enabling dynamic weight determination, but this has the drawback of requiring an additional process. We present a simple method that enables dynamic weight determination without any such additional process, and propose an algorithm that can easily be applied to existing systems without architectural changes and that shows better performance than AdaBoost.
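The abstract does not spell out the weighting rule itself, but the contrast it draws between static and dynamic weights can be sketched in a few lines. In the sketch below, the decision-stump classifiers, the data, and the confidence-based scaling rule are all illustrative assumptions for the general idea, not the paper's actual algorithm: each classifier's fixed post-training weight is scaled by a per-sample confidence score at test time, so the effective weights change with the test input while the ensemble's structure stays the same.

```python
# Illustrative sketch of static vs. dynamic ensemble weighting.
# The stumps, data, and confidence rule are assumptions, not the paper's method.

def stump(feature_index, threshold):
    """A decision stump: predicts +1 if x[feature_index] > threshold, else -1."""
    def predict(x):
        return 1 if x[feature_index] > threshold else -1
    def confidence(x):
        # Margin from the threshold, used here as a per-sample reliability score.
        return abs(x[feature_index] - threshold)
    return predict, confidence

def static_vote(classifiers, weights, x):
    """Traditional ensemble: weights fixed after the training phase."""
    s = sum(w * p(x) for (p, _), w in zip(classifiers, weights))
    return 1 if s >= 0 else -1

def dynamic_vote(classifiers, weights, x):
    """Dynamic weighting: scale each trained weight by the classifier's
    confidence on this particular test sample -- no extra training process."""
    s = sum(w * c(x) * p(x) for (p, c), w in zip(classifiers, weights))
    return 1 if s >= 0 else -1

classifiers = [stump(0, 0.5), stump(1, 0.3)]
weights = [0.6, 0.4]   # e.g. weights produced by a boosting-style training phase

x = [0.55, -0.5]       # a test sample near the first stump's threshold
print(static_vote(classifiers, weights, x))    # → 1  (fixed weights favor stump 0)
print(dynamic_vote(classifiers, weights, x))   # → -1 (stump 0 has low confidence here)
```

Note that both voting rules operate on the same trained classifiers and weights, which reflects the abstract's claim that dynamic weighting can be applied to an existing ensemble without architectural modification.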

References

  1. L.K. Hansen and P. Salamon, "Neural network ensembles," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 12, no. 10, pp. 993-1001, 1990. https://doi.org/10.1109/34.58871
  2. R.E. Schapire, "The strength of weak learnability," Machine Learning, vol. 5, no. 2, pp. 197-227, 1990.
  3. Y. Freund and R.E. Schapire, "A decision-theoretic generalization of on-line learning and an application to boosting," Journal of Computer and System Sciences, vol. 55, no. 1, pp. 119-139, 1997. https://doi.org/10.1006/jcss.1997.1504
  4. Y. Freund and R.E. Schapire, "A Short Introduction to Boosting," Journal of Japanese Society for Artificial Intelligence, vol. 14, no. 5, pp. 771-780, 1999.
  5. K. Woods, W.P.J. Kegelmeyer, and K. Bowyer, "Combination of multiple classifiers using local accuracy estimates," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, no. 4, pp. 405-410, 1997. https://doi.org/10.1109/34.588027
  6. I. Bloch, "Information combination operators for data fusion: A comparative review with classification," IEEE Transactions on Systems, Man, and Cybernetics Part A: Systems and Humans, vol. 26, no. 1, pp. 52-67, 1996. https://doi.org/10.1109/3468.477860
  7. R. Battiti and A.M. Colla, "Democracy in neural nets: Voting schemes for classification," Neural Networks, vol. 7, no. 4, pp. 691-707, 1994. https://doi.org/10.1016/0893-6080(94)90046-9
  8. L. Xu, A. Krzyzak, and C.Y. Suen, "Methods of combining multiple classifiers and their applications to handwriting recognition," IEEE Transactions on Systems, Man and Cybernetics, vol. 22, no. 3, pp. 418-435, 1992. https://doi.org/10.1109/21.155943
  9. T.K. Ho, J.J. Hull, and S.N. Srihari, "Decision combination in multiple classifier systems," IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 16, no. 1, pp. 66-75, 1994. https://doi.org/10.1109/34.273716
  10. G. Rogova, "Combining the results of several neural network classifiers," Neural Networks, vol. 7, no. 5, pp. 777-781, 1994. https://doi.org/10.1016/0893-6080(94)90099-X
  11. L. Lam and C.Y. Suen, "Optimal combinations of pattern classifiers," Pattern Recognition Letters, vol. 16, no. 9, pp. 945-954, 1995. https://doi.org/10.1016/0167-8655(95)00050-Q
  12. H. Drucker, C. Cortes, L.D. Jackel, Y. LeCun, and V. Vapnik, "Boosting and other ensemble methods," Neural Computation, vol. 6, no. 6, pp. 1289-1301, 1994. https://doi.org/10.1162/neco.1994.6.6.1289
  13. L.I. Kuncheva, "Classifier ensembles for changing environments," 5th Int. Workshop on Multiple Classifier Systems, Lecture Notes in Computer Science, F. Roli, J. Kittler, and T. Windeatt, Eds., vol. 3077, pp. 1-15, 2004.
  14. R.A. Jacobs, M.I. Jordan, S.J. Nowlan, and G.E. Hinton, "Adaptive mixtures of local experts," Neural Computation, vol. 3, no. 1, pp. 79-87, 1991. https://doi.org/10.1162/neco.1991.3.1.79
  15. M.I. Jordan and R.A. Jacobs, "Hierarchical mixtures of experts and the EM algorithm," Neural Computation, vol. 6, no. 2, pp. 181-214, 1994. https://doi.org/10.1162/neco.1994.6.2.181
  16. R.E. Schapire, "The strength of weak learnability," Machine Learning, vol. 5, no. 2, pp. 197-227, 1990.
  17. L. Breiman, "Bagging predictors," Machine Learning, vol. 24, no. 2, pp. 123-140, 1996.
  18. C.J. Merz and P.M. Murphy, UCI Repository of Machine Learning Databases, 1998. www.ics.uci.edu/~mlearn/MLRepository.html