 Title & Authors
Optimal k-Nearest Neighborhood Classifier Using Genetic Algorithm
Park, Chong-Sun; Huh, Kyun;
 Abstract
Feature selection and feature weighting are useful techniques for improving the classification accuracy of the k-Nearest Neighbor (k-NN) classifier. The main purpose of feature selection and feature weighting is to reduce the number of features, by eliminating irrelevant and redundant ones, while maintaining or enhancing classification accuracy. In this paper, a novel hybrid approach based on a Genetic Algorithm is proposed for simultaneous feature selection, feature weighting, and choice of k in the k-NN classifier. The results indicate that the proposed algorithm is comparable or superior to existing classifiers with or without feature selection and feature weighting capability.
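The abstract describes the approach only at a high level. The sketch below is not the authors' implementation; it is a minimal illustration of how a real-coded genetic algorithm could simultaneously evolve per-feature weights (with near-zero weights acting as feature elimination) and the neighborhood size k, scoring each chromosome by the validation accuracy of a weighted k-NN classifier. The GA settings (population size, binary tournament selection, uniform crossover, Gaussian mutation), the 0.05 elimination threshold, the k range, and the synthetic data are all illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def weighted_knn_accuracy(weights, k, X_train, y_train, X_val, y_val):
    # Accuracy of k-NN under a feature-weighted Euclidean distance.
    Xw_tr = X_train * weights
    Xw_va = X_val * weights
    d = ((Xw_va[:, None, :] - Xw_tr[None, :, :]) ** 2).sum(axis=2)
    idx = np.argsort(d, axis=1)[:, :k]                  # k nearest training points
    votes = y_train[idx]
    preds = np.array([np.bincount(v).argmax() for v in votes])
    return float((preds == y_val).mean())

def fitness(chrom, data):
    # Chromosome = [w_1, ..., w_p, k_gene]; near-zero weights act as feature selection.
    X_tr, y_tr, X_va, y_va = data
    weights = np.where(chrom[:-1] < 0.05, 0.0, chrom[:-1])   # assumed elimination threshold
    k = 1 + int(chrom[-1] * 14)                              # maps the k gene to k in {1, ..., 15}
    if weights.sum() == 0.0:
        return 0.0
    return weighted_knn_accuracy(weights, k, X_tr, y_tr, X_va, y_va)

def evolve(data, n_features, pop_size=30, n_gen=40, p_mut=0.1):
    pop = rng.random((pop_size, n_features + 1))             # feature weights plus k gene
    for _ in range(n_gen):
        fit = np.array([fitness(c, data) for c in pop])
        # binary tournament selection
        winners = [max(rng.integers(pop_size, size=2), key=lambda i: fit[i])
                   for _ in range(pop_size)]
        parents = pop[winners]
        # uniform crossover between each parent and its neighbor in the mating pool
        mask = rng.random(parents.shape) < 0.5
        children = np.where(mask, parents, np.roll(parents, 1, axis=0))
        # Gaussian mutation of roughly a fraction p_mut of the genes, clipped to [0, 1]
        mutate = rng.random(children.shape) < p_mut
        children = np.clip(children + mutate * rng.normal(0.0, 0.2, children.shape), 0.0, 1.0)
        pop = children
    fit = np.array([fitness(c, data) for c in pop])
    return pop[np.argmax(fit)], float(fit.max())

# Illustrative run on synthetic data: only the first two of six features are informative.
X = rng.normal(size=(200, 6))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
data = (X[:150], y[:150], X[150:], y[150:])
best, acc = evolve(data, n_features=6)
print("best validation accuracy:", acc, "chosen k:", 1 + int(best[-1] * 14))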
 Keywords
k-Nearest Neighborhood classifier; genetic algorithm; feature selection; feature weighting
 Language
Korean
 Cited by
1.
A Study of Travel Time Prediction using K-Nearest Neighborhood Method, Korean Journal of Applied Statistics, 2013, 26, 5, 835.
 References
1.
Bao, Y., Du, X. and Ishii, N. (2002). Combining Feature Selection with Feature Weighting for k-NN Classifier, IDEAL 2002, LNCS 2412, Springer-Verlag, 461-468.

2.
Fung, G., Liu, J. and Lau, R. (1996). Feature selection in automatic signature verification based on genetic algorithms, In Proceedings of the International Conference on Neural Information Processing, 811-815.

3.
Kelly, J. and Davis, L. (1991). A hybrid genetic algorithm for classification, In Proceedings of the Twelfth International Joint Conference on Artificial Intelligence, 645-650.

4.
Komosinski, M. and Krawiec, K. (2000). Evolutionary weighting of image features for diagnosing of CNS tumors, Artificial Intelligence in Medicine, 19, 25-38.

5.
Kudo, M. and Sklansky, J. (2000). Comparison of algorithms that select features for pattern classifiers, Pattern Recognition, 33, 25-41.

6.
Michie, D., Spiegelhalter, D. and Taylor, C. (1994). Machine Learning, Neural and Statistical Classification, Ellis Horwood.

7.
Moser, A. and Murty, M. (2000). On the scalability of genetic algorithms to very large-scale feature selection, Real World Applications of Evolutionary Computing, 77-86.

8.
Paredes, R. and Vidal, E. (2000). A Class-Dependent Weighted Dissimilarity Measure for Nearest Neighbor Classification Problems, Pattern Recognition Letters, 21, 1027-1036.

9.
Punch, W., Goodman, E. and Pei, M. (1993). Further research on feature selection and classification using genetic algorithms, In Proceedings of the Fifth International Conference on Genetic Algorithms, 379-383.

10.
Quinlan, J. (1993). C4.5: Programs for machine learning, Morgan Kaufmann.

11.
Raymer, M., Punch, W., Goodman, E., Kuhn, L. and Jain, A. (2000). Dimensionality reduction using genetic algorithms, IEEE Transactions on Evolutionary Computation, 4, 164-171.

12.
Siedlecki, W. and Sklansky, J. (1990). A note on genetic algorithms for large-scale feature selection, IEEE Transactions on Computers, 335-347.

13.
Smith, J., Fogarty, T. and Johnson, I. (1994). Genetic feature selection for clustering and classification, In Proceedings of the IEE Colloquium on Genetic Algorithms in Image Processing and Vision, 193-196.

14.
Tahir, M., Bouridane, A. and Kurugollu, F. (2007). Simultaneous feature selection and feature weighting using Hybrid Tabu Search/k-nearest neighbor classifier, Pattern Recognition Letters, 28, 438-446.

15.
Wettschereck, D., Aha, D. W. and Mohri, T. (1997). A review and empirical evaluation of feature weighting methods for a class of lazy learning algorithms, Artificial Intelligence Review, 11, 273-314.