Dimensionality reduction for pattern recognition based on difference of distribution among classes

  • Nishimura, Masaomi (Graduate School of Science and Engineering, Saitama University) ;
  • Hiraoka, Kazuyuki (Department of Information and Computer Sciences, Saitama University) ;
  • Mishima, Taketoshi (Department of Information and Computer Sciences, Saitama University)
  • Published: 2002.07.01

Abstract

For pattern recognition on high-dimensional data such as images, dimensionality reduction as a preprocessing step is effective. By dimensionality reduction, we can (1) reduce storage capacity and the amount of calculation, and (2) avoid "the curse of dimensionality" and improve classification performance. Popular tools for dimensionality reduction are Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), and, recently, Independent Component Analysis (ICA). Among them, only LDA takes the class labels into consideration. Nevertheless, it has been reported that the classification performance with ICA is better than that with LDA, because LDA restricts the number of dimensions after reduction. To overcome this dilemma, we propose a new dimensionality reduction technique based on an information-theoretic measure of the difference of distributions among classes. It takes the class labels into consideration, yet it does not restrict the number of dimensions after reduction. Improvement of classification performance has been confirmed experimentally.
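The dimension restriction of LDA mentioned above is the standard one: with C classes, LDA can yield at most C − 1 discriminant axes, whereas PCA can keep any number of components up to the feature count. A minimal sketch of this contrast, using scikit-learn on synthetic data (not the paper's experiments or its proposed method):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# Toy data: 3 classes, 10-dimensional features (hypothetical example)
X = rng.normal(size=(300, 10))
y = np.repeat([0, 1, 2], 100)

# PCA: any number of components up to the feature dimension is allowed
X_pca = PCA(n_components=5).fit_transform(X)
print(X_pca.shape)  # (300, 5)

# LDA: at most (number of classes - 1) = 2 components
X_lda = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y)
print(X_lda.shape)  # (300, 2)

# Requesting more than C - 1 components is rejected
try:
    LinearDiscriminantAnalysis(n_components=3).fit(X, y)
except ValueError:
    print("LDA cannot exceed n_classes - 1 components")
```

This is the dilemma the proposed technique targets: label-aware reduction like LDA, but without the C − 1 ceiling on the reduced dimension.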

Keywords