Fuzzy Decision Tree Induction for Obliquely Partitioning a Feature Space

  • Published: 2002.04.01

Abstract

Decision tree induction is a useful machine learning approach for extracting classification rules from a set of feature-based examples. According to how they partition the feature space, decision trees are categorized into univariate decision trees and multivariate decision trees. Due to observation error, uncertainty, subjective judgment, and so on, real-world data are prone to contain errors in their feature values. To make decision trees robust against such errors, there have been various attempts to incorporate fuzzy techniques into decision tree construction. Most of this work, however, has applied fuzzy techniques to univariate decision trees; little research has addressed multivariate decision trees. This paper proposes a fuzzy decision tree induction method that builds fuzzy multivariate decision trees, called fuzzy oblique decision trees. To show the characteristics of the proposed method, some experimental results are also presented.
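The distinction the abstract draws between univariate (axis-parallel), oblique (linear-combination), and fuzzy oblique node tests can be sketched as follows. This is an illustrative sketch only, not the paper's actual formulation: the function names, the sigmoid membership function, and the sharpness parameter `beta` are assumptions made for the example.

```python
import math

# Illustrative sketch (not the paper's exact algorithm). A univariate test
# compares a single feature with a threshold; an oblique (multivariate)
# test compares a linear combination of features with a threshold; a
# fuzzy oblique test softens that comparison into a membership degree.

def univariate_split(x, feature, threshold):
    """Axis-parallel test: True means the example goes to the left branch."""
    return x[feature] <= threshold

def oblique_split(x, w, b):
    """Oblique test: left branch iff w . x + b <= 0 (a hyperplane)."""
    return sum(wi * xi for wi, xi in zip(w, x)) + b <= 0.0

def fuzzy_oblique_membership(x, w, b, beta=4.0):
    """Degree in [0, 1] to which x belongs to the left branch.

    A sigmoid of the signed distance to the hyperplane (an assumed choice
    of membership function); larger beta gives a sharper transition, and
    beta -> infinity recovers the crisp oblique split.
    """
    s = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(beta * s))

# Example: the hyperplane x1 - x2 = 0 separates the plane diagonally,
# which no single axis-parallel test can express.
x = [1.0, 2.0]
w = [1.0, -1.0]
left = fuzzy_oblique_membership(x, w, b=0.0)   # well inside the left region
right = 1.0 - left                             # memberships sum to 1
```

Because the membership degrees vary smoothly across the hyperplane, examples whose feature values carry small errors are assigned partly to both branches instead of being routed to entirely the wrong subtree, which is the robustness motivation given in the abstract.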

Keywords

References

  1. C. E. Brodley, P. E. Utgoff, Multivariate versus univariate decision trees, Technical Report 92-8, Dept. of Computer Science, Univ. of Massachusetts, Amherst, MA, 1992
  2. Y. Yuan, M. J. Shaw, Induction of fuzzy decision trees, Fuzzy Sets and Systems, Vol. 69, pp. 125-139, 1995 https://doi.org/10.1016/0165-0114(94)00229-Z
  3. S. K. Murthy, S. Kasif, S. Salzberg, A System for Induction of Oblique Decision Trees, Journal of Artificial Intelligence Research, Vol. 2, pp. 1-33, 1994
  4. J. R. Quinlan, C4.5: Programs for Machine Learning, Morgan Kaufmann, San Mateo, CA, 1993
  5. G. J. Klir, T. A. Folger, Fuzzy Sets, Uncertainty, and Information, Prentice-Hall, 1992
  6. C. E. Brodley, P. E. Utgoff, Multivariate decision trees, Machine Learning, Vol. 19, pp. 45-77, 1995
  7. C. Z. Janikow, Fuzzy Decision Trees: Issues and Methods, IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, Vol. 28, No. 1, pp. 1-14, 1998 https://doi.org/10.1109/3477.658573
  8. L. Breiman, J. Friedman, R. Olshen, C. Stone, Classification and Regression Trees, Wadsworth International Group, 1984
  9. J. R. Quinlan, Decision trees as probabilistic classifiers, Proc. 4th Int. Workshop on Machine Learning, pp. 31-37, 1987
  10. J. Zeidler, M. Schlosser, Continuous-Valued Attributes in Fuzzy Decision Trees, Proc. of Information Processing and Management of Uncertainty in Knowledge-Based Systems (IPMU'96), Granada, Spain, Volume I, pp. 395-400, 1996
  11. J. R. Quinlan, Induction of Decision Trees, Machine Learning, Vol. 1, pp. 81-106, 1986 https://doi.org/10.1023/A:1022643204877
  12. L. A. Breslow, D. W. Aha, Simplifying decision trees: A Survey, NCARAI Technical Report No. AIC-96-014, Naval Research Laboratory, 1996
  13. T. M. Mitchell, Machine Learning, McGraw-Hill, 1997
  14. X. Wang, J. Hong, Learning optimization in simplifying fuzzy rules, Fuzzy Sets and Systems, Vol. 106, pp. 349-356, 1999 https://doi.org/10.1016/S0165-0114(97)00300-X
  15. K.-M. Lee, K. M. Lee, J.-H. Lee, H. Lee-Kwang, A Fuzzy Decision Tree Induction Method for Fuzzy Data, Proc. of Int. Conf. on FUZZ-IEEE, Seoul, Korea, pp. 16-21, 1999 https://doi.org/10.1109/FUZZY.1999.793199
  16. H. Kim, L. Fu, Generalization and Fault Tolerance in Rule-based Neural Networks, Proc. of 1994 IEEE Conf. on Neural Networks, Vol. 3, pp. 1550-1555, 1994 https://doi.org/10.1109/ICNN.1994.374386
  17. Y.-H. Pao, Adaptive Pattern Recognition and Neural Networks, Addison-Wesley, 1989