• Title/Summary/Keyword: classifier ensemble

Ensemble Learning for Underwater Target Classification (수중 표적 식별을 위한 앙상블 학습)

  • Seok, Jongwon
    • Journal of Korea Multimedia Society / v.18 no.11 / pp.1261-1267 / 2015
  • The problem of underwater target detection and classification has attracted substantial attention and has been studied by many researchers for both military and non-military purposes. The task is complicated by varying environmental conditions. In this paper, we study classifier ensemble methods for active sonar target classification in order to improve classification performance. In general, classifier ensemble methods are useful for classifiers with relatively large variance, such as decision trees and neural networks. Bagging, random sample selection, random subspace, and rotation forest were selected as the ensemble methods. Using these four ensemble methods with 31 neural network base classifiers, classification tests were carried out and the performances were compared.
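
As a rough illustration of two of the ensemble schemes named in this abstract (bagging and the random subspace method) with neural-network base classifiers, the sketch below uses scikit-learn (parameter names assume version 1.2 or later). The synthetic data merely stands in for the active sonar measurements, which are not available here.

```python
# Minimal sketch: bagging and random-subspace ensembles of 31 small neural
# networks, compared by cross-validated accuracy on synthetic stand-in data.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=500, n_features=40, n_informative=10,
                           random_state=0)
base = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)

# Bagging: each of the 31 base networks sees a bootstrap resample of the data.
bagging = BaggingClassifier(estimator=base, n_estimators=31, random_state=0)

# Random subspace: each network is trained on a random 50% subset of features.
subspace = BaggingClassifier(estimator=base, n_estimators=31, bootstrap=False,
                             max_features=0.5, random_state=0)

for name, clf in [("bagging", bagging), ("random subspace", subspace)]:
    print(name, cross_val_score(clf, X, y, cv=5).mean())
```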

Optimization of Random Subspace Ensemble for Bankruptcy Prediction (재무부실화 예측을 위한 랜덤 서브스페이스 앙상블 모형의 최적화)

  • Min, Sung-Hwan
    • Journal of Information Technology Services / v.14 no.4 / pp.121-135 / 2015
  • Ensemble classification uses multiple classifiers instead of a single classifier. Ensemble classifiers have recently attracted much attention in the data mining community, and ensemble learning techniques have proved very useful for improving prediction accuracy. Bagging, boosting, and random subspace are the most popular ensemble methods. In the random subspace method, each base classifier is trained on a randomly chosen feature subspace of the original feature space, and the outputs of the base classifiers are aggregated, usually by a simple majority vote. In this study, we applied the random subspace method to the bankruptcy prediction problem and proposed a method for optimizing the random subspace ensemble: a genetic algorithm was used to select the classifier subset of the ensemble. The proposed genetic-algorithm-based random subspace ensemble was applied to bankruptcy prediction using a real data set and compared with other models. Experimental results showed that the proposed model outperformed the other models.
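
The following is a loose sketch of the idea described in this abstract: build a pool of random-subspace base classifiers, then let a genetic algorithm search over a binary include/exclude string for the classifier subset whose majority vote scores best on a validation split. The data set, pool size, and GA settings are illustrative only, not those of the paper.

```python
# GA-optimised random subspace ensemble (sketch).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=600, n_features=30, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

# Pool of base classifiers, each trained on a random half of the features.
pool = []
for _ in range(20):
    feats = rng.choice(X.shape[1], size=15, replace=False)
    clf = DecisionTreeClassifier(random_state=0).fit(X_tr[:, feats], y_tr)
    pool.append((feats, clf))

def ensemble_acc(mask):
    """Validation accuracy of the majority vote of the selected classifiers."""
    idx = np.flatnonzero(mask)
    if idx.size == 0:
        return 0.0
    votes = np.array([pool[i][1].predict(X_val[:, pool[i][0]]) for i in idx])
    pred = (votes.mean(axis=0) >= 0.5).astype(int)
    return (pred == y_val).mean()

# Plain generational GA: truncation selection, one-point crossover, bit-flip mutation.
popsize, gens = 30, 40
population = rng.integers(0, 2, size=(popsize, len(pool)))
for _ in range(gens):
    fitness = np.array([ensemble_acc(ind) for ind in population])
    parents = population[np.argsort(fitness)[-popsize // 2:]]
    children = []
    while len(children) < popsize:
        a, b = parents[rng.integers(len(parents), size=2)]
        cut = rng.integers(1, len(pool))
        child = np.concatenate([a[:cut], b[cut:]])
        flip = rng.random(len(pool)) < 0.05          # mutation
        child[flip] = 1 - child[flip]
        children.append(child)
    population = np.array(children)

best = max(population, key=ensemble_acc)
print("selected classifiers:", np.flatnonzero(best), "accuracy:", ensemble_acc(best))
```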

Optimal Classifier Ensemble Design for Vehicle Detection Using GAVaPS (자동차 검출을 위한 GAVaPS를 이용한 최적 분류기 앙상블 설계)

  • Lee, Hee-Sung;Lee, Jae-Hung;Kim, Eun-Tai
    • Journal of Institute of Control, Robotics and Systems / v.16 no.1 / pp.96-100 / 2010
  • This paper proposes a genetic design of an optimal classifier ensemble for vehicle detection using a Genetic Algorithm with Varying Population Size (GAVaPS). Many classifiers are combined in an ensemble to deal with large amounts of data, but the selection problem has an exponentially large search space as the classifier pool grows. To solve this problem, we employ GAVaPS, which outperforms the simple genetic algorithm (SGA). Experiments are performed to demonstrate the efficiency of the proposed method.
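
The distinguishing feature of GAVaPS is that there is no explicit selection step: each individual receives a fitness-dependent lifetime and is removed once its age exceeds it, so the population size varies over time. The sketch below is one simple version of that mechanism under my own assumptions (proportional lifetime allocation, OneMax as a stand-in for the real "which classifiers to include" fitness); the paper's exact settings are not reproduced.

```python
# GAVaPS-style loop (sketch): reproduction by a fixed ratio, ageing, and
# fitness-dependent lifetimes instead of an explicit selection operator.
import numpy as np

rng = np.random.default_rng(0)
L, min_lt, max_lt, rho = 30, 1, 7, 0.4   # chromosome length, lifetime bounds, reproduction ratio

def fitness(genes):
    return genes.sum()                    # toy objective (OneMax stand-in)

def lifetime(f, f_min, f_max):
    # One simple proportional allocation: fitter individuals live longer,
    # so they get more chances to reproduce.
    if f_max == f_min:
        return (min_lt + max_lt) // 2
    return int(min_lt + (max_lt - min_lt) * (f - f_min) / (f_max - f_min))

population = [{"genes": rng.integers(0, 2, L), "age": 0} for _ in range(20)]
for _ in range(60):
    # Reproduction: a fraction rho of the current population creates offspring.
    for _ in range(max(1, int(rho * len(population)))):
        a, b = rng.integers(len(population), size=2)
        cut = rng.integers(1, L)
        genes = np.concatenate([population[a]["genes"][:cut],
                                population[b]["genes"][cut:]])
        flip = rng.random(L) < 0.02
        genes[flip] = 1 - genes[flip]
        population.append({"genes": genes, "age": 0})
    # Ageing and death: the population size varies instead of being fixed.
    fits = np.array([fitness(p["genes"]) for p in population])
    f_min, f_max = fits.min(), fits.max()
    survivors = []
    for p, f in zip(population, fits):
        p["age"] += 1
        if p["age"] <= lifetime(f, f_min, f_max):
            survivors.append(p)
    population = survivors if len(survivors) >= 2 else \
        [{"genes": rng.integers(0, 2, L), "age": 0} for _ in range(2)]

print("best:", max(fitness(p["genes"]) for p in population),
      "population size:", len(population))
```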

Coarse-to-fine Classifier Ensemble Selection using Clustering and Genetic Algorithms (군집화와 유전 알고리즘을 이용한 거친-섬세한 분류기 앙상블 선택)

  • Kim, Young-Won;Oh, Il-Seok
    • Journal of KIISE:Software and Applications / v.34 no.9 / pp.857-868 / 2007
  • A good classifier ensemble should have high complementarity among its classifiers in order to achieve a high recognition rate, and it should be small in order to be efficient. This paper proposes a classifier ensemble selection algorithm with coarse-to-fine stages. For the algorithm to be successful, the original classifier pool should be sufficiently diverse, so we produce a large classifier pool by combining several different classification algorithms with many feature subsets. The aim of the coarse selection is to reduce the size of the classifier pool with little sacrifice of recognition performance; the fine selection then finds a near-optimal ensemble using genetic algorithms. A hybrid genetic algorithm with improved search capability is also proposed. Experiments on worldwide handwritten numeral databases show that the proposed algorithm is superior to conventional ones.
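
A small sketch of the coarse stage described here, under illustrative assumptions: cluster a large classifier pool by the similarity of their validation predictions and keep only the most accurate member of each cluster, so that the GA-based fine stage (as in the sketch after the next abstract on this topic) searches a much smaller pool. Pool construction and the cluster count are placeholders, not the paper's settings.

```python
# Coarse selection: group classifiers that behave alike, keep the best per group.
import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=800, n_features=40, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

# Large, diverse pool: trees trained on random feature subsets.
pool, preds = [], []
for _ in range(60):
    feats = rng.choice(40, size=12, replace=False)
    clf = DecisionTreeClassifier(random_state=0).fit(X_tr[:, feats], y_tr)
    pool.append((feats, clf))
    preds.append(clf.predict(X_val[:, feats]))
preds = np.array(preds)                        # one prediction vector per classifier

labels = AgglomerativeClustering(n_clusters=10).fit_predict(preds)
accs = (preds == y_val).mean(axis=1)
coarse = [int(np.flatnonzero(labels == c)[np.argmax(accs[labels == c])])
          for c in range(10)]
print("coarse pool (classifier indices):", coarse)  # the fine GA stage searches only these
```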

Performance Improvement of Classifier by Combining Disjunctive Normal Form features

  • Min, Hyeon-Gyu;Kang, Dong-Joong
    • International Journal of Internet, Broadcasting and Communication / v.10 no.4 / pp.50-64 / 2018
  • This paper describes a visual object detection approach using ensemble-based machine learning. Object detection methods employing 1D features have the benefit of fast computation, but on real images with complex backgrounds their detection accuracy and performance degrade. In this paper, we propose an ensemble learning algorithm that combines a 1D feature classifier and a 2D DNF (Disjunctive Normal Form) classifier to improve object detection performance on a single input image. To improve computational efficiency and accuracy, we also propose a feature selection method that reduces computing time, together with an ensemble algorithm combining the 1D features and 2D DNF features. In the verification experiments, we selected the Haar-like feature as the 1D image descriptor and demonstrated the performance of the algorithm on datasets such as faces and vehicles.
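
The combination idea can be roughly illustrated as two classifiers trained on different feature views and fused by averaging their probability scores. The boosted Haar-like cascade and the paper's DNF learner are not reproduced here; AdaBoost on one feature view and a shallow decision tree on the other (whose positive leaves form a disjunctive normal form of threshold tests) stand in, on synthetic data.

```python
# Two-view score fusion (sketch): a "1D-feature" classifier plus a DNF-like
# classifier, combined by averaging predicted probabilities.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=60, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
view1, view2 = slice(0, 30), slice(30, 60)     # stand-ins for the 1D / 2D feature sets

clf_1d = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X_tr[:, view1], y_tr)
clf_dnf = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X_tr[:, view2], y_tr)

proba = (clf_1d.predict_proba(X_te[:, view1]) +
         clf_dnf.predict_proba(X_te[:, view2])) / 2
print("fused accuracy:", (proba.argmax(axis=1) == y_te).mean())
```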

Detection for JPEG steganography based on evolutionary feature selection and classifier ensemble selection

  • Ma, Xiaofeng;Zhang, Yi;Song, Xiangfeng;Fan, Chao
    • KSII Transactions on Internet and Information Systems (TIIS) / v.11 no.11 / pp.5592-5609 / 2017
  • JPEG steganography detection is an active research topic in information hiding, due to the wide use of JPEG images in social networks, image-sharing websites, and Internet communication. In this paper, a new steganalysis method for content-adaptive JPEG steganography is proposed that integrates evolutionary feature selection and classifier ensemble selection. First, the overall framework of the proposed method is presented and its characteristics are analyzed. Second, a feature selection method based on a genetic algorithm is given and its implementation is described in detail. Third, a classifier ensemble selection method based on Pareto evolutionary optimization is proposed. Experimental results indicate that the proposed method achieves competitive detection performance compared with state-of-the-art steganalysis methods when detecting the latest content-adaptive JPEG steganography algorithms.
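
A minimal sketch of the Pareto ensemble-selection step, under stated assumptions: candidate classifier subsets are scored on two objectives (validation error and ensemble size) and only the non-dominated subsets are kept. Rich steganalysis features are not available here, so a synthetic two-class problem, a small tree pool, and random candidate subsets stand in for the paper's evolutionary search.

```python
# Pareto (non-dominated) selection over candidate ensembles (sketch).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=600, n_features=50, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

pool = []
for _ in range(15):
    feats = rng.choice(50, size=20, replace=False)
    pool.append((feats, DecisionTreeClassifier(random_state=0).fit(X_tr[:, feats], y_tr)))

def objectives(mask):
    """Two objectives to minimise: majority-vote error and ensemble size."""
    idx = np.flatnonzero(mask)
    votes = np.array([pool[i][1].predict(X_val[:, pool[i][0]]) for i in idx])
    err = ((votes.mean(axis=0) >= 0.5).astype(int) != y_val).mean()
    return err, idx.size

candidates = [c for c in (rng.integers(0, 2, len(pool)) for _ in range(100)) if c.any()]
scores = [objectives(c) for c in candidates]

# Keep the Pareto front: a candidate survives if no other is at least as good
# on both objectives and different in at least one.
front = [c for c, s in zip(candidates, scores)
         if not any(o[0] <= s[0] and o[1] <= s[1] and o != s for o in scores)]
print("non-dominated ensembles:", len(front))
```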

Hybrid Genetic Algorithm for Classifier Ensemble Selection (분류기 앙상블 선택을 위한 혼합 유전 알고리즘)

  • Kim, Young-Won;Oh, Il-Seok
    • The KIPS Transactions:PartB / v.14B no.5 / pp.369-376 / 2007
  • This paper proposes a hybrid genetic algorithm (HGA) for classifier ensemble selection. The HGA adds a local search operation to improve fine-tuning in local regions of the search space. We apply both the hybrid and the simple genetic algorithm (SGA) to the classifier ensemble selection problem in order to show the superiority of the HGA, and we propose two local search operations: SSO (Sequential Search Operation) and CSO (Combinatorial Search Operation). Experimental results show that the HGA has better search capability than the SGA, and that CSO, which considers the correlation among classifiers, is better than SSO.
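
To make the "GA plus local search" idea concrete, here is a minimal sketch in the spirit of a sequential search operation: scan the chromosome once and keep any bit flip (include/exclude a classifier) that improves fitness. The fitness function and chromosome encoding below are placeholders, not the paper's.

```python
# One pass of bit-wise hill climbing, applied to a GA individual (memetic step).
import numpy as np

def sequential_search(individual, fitness):
    """Flip each bit in turn, keeping the flip whenever it improves fitness."""
    best = individual.copy()
    best_f = fitness(best)
    for i in range(len(best)):
        trial = best.copy()
        trial[i] = 1 - trial[i]               # include/exclude classifier i
        f = fitness(trial)
        if f > best_f:
            best, best_f = trial, f
    return best

# Usage inside a GA generation (toy fitness: prefer ensembles of about 5 members).
rng = np.random.default_rng(0)
fitness = lambda ind: -abs(ind.sum() - 5)
ind = rng.integers(0, 2, 20)
print(fitness(ind), fitness(sequential_search(ind, fitness)))
```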

Ensemble Classifier with Negatively Correlated Features for Cancer Classification (암 분류를 위한 음의 상관관계 특징을 이용한 앙상블 분류기)

  • 원홍희;조성배
    • Journal of KIISE:Software and Applications / v.30 no.12 / pp.1124-1134 / 2003
  • The development of microarray technology has supplied a large volume of data to many fields. In particular, it has been applied to the prediction and diagnosis of cancer, where it is expected to help predict and diagnose cancer accurately. Because the amount of DNA microarray data is usually very large, it is essential to analyze it efficiently. Since accurate classification of cancer is a very important issue for treatment, it is desirable to make decisions by combining the results of several expert classifiers rather than relying on a single classifier; combining classifiers generally gives higher performance and higher confidence. Despite these advantages, an ensemble of mutually error-correlated classifiers has limited performance. In this paper, we propose an ensemble of neural network classifiers learned from negatively correlated features to classify cancer precisely, and we systematically evaluate the performance of the proposed method on three benchmark datasets. Experimental results show that the ensemble classifier with negatively correlated features produces the best recognition rate on all three benchmark datasets.
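
The sketch below loosely illustrates the ingredient highlighted here: base neural networks trained on feature groups chosen by correlation so that their errors are less correlated, then combined by averaging. The paper's exact negative-correlation grouping rule is not reproduced; splitting features by the sign of their correlation with the class label is used as a simple stand-in, on synthetic data in place of the microarray benchmarks.

```python
# Ensemble of MLPs trained on correlation-based feature groups (sketch).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=300, n_features=100, n_informative=20,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Correlation of every feature with the class label.
corr = np.array([np.corrcoef(X_tr[:, j], y_tr)[0, 1] for j in range(X_tr.shape[1])])
groups = [np.flatnonzero(corr > 0), np.flatnonzero(corr < 0)]

probas = []
for g in groups:
    net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
    net.fit(X_tr[:, g], y_tr)
    probas.append(net.predict_proba(X_te[:, g]))
pred = np.mean(probas, axis=0).argmax(axis=1)   # average the member probabilities
print("ensemble accuracy:", (pred == y_te).mean())
```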

Effective Korean sentiment classification method using word2vec and ensemble classifier (Word2vec과 앙상블 분류기를 사용한 효율적 한국어 감성 분류 방안)

  • Park, Sung Soo;Lee, Kun Chang
    • Journal of Digital Contents Society / v.19 no.1 / pp.133-140 / 2018
  • Accurate sentiment classification is an important research topic in sentiment analysis. This study suggests an efficient method for classifying Korean sentiment using word2vec and ensemble methods, both of which have recently been studied extensively. For 200,000 Korean movie review texts, we generate a POS-based bag-of-words (BOW) feature, a word2vec feature, and an integrated representation of the two. For sentiment classification we use the single classifiers Logistic Regression, Decision Tree, Naive Bayes, and Support Vector Machine, and the ensemble classifiers AdaBoost, Bagging, Gradient Boosting, and Random Forest. The integrated representation composed of the BOW feature (including adjectives and adverbs) and the word2vec feature showed the highest sentiment classification accuracy. Empirical results show that SVM, a single classifier, has the highest performance, while the ensemble classifiers show similar or slightly lower performance.
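
A minimal sketch of the feature pipeline described here: a bag-of-words view and a word2vec view (mean of word vectors per document) are concatenated and fed both to a single classifier and to an ensemble classifier. A toy tokenised corpus replaces the 200,000 Korean movie reviews, and the gensim 4.x and scikit-learn APIs are assumed; POS filtering is omitted.

```python
# BOW + word2vec integrated features, compared across classifier types (sketch).
import numpy as np
from gensim.models import Word2Vec
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.svm import SVC

docs = [["good", "movie", "great", "acting"], ["bad", "boring", "plot"],
        ["great", "fun"], ["terrible", "bad", "acting"]]
labels = np.array([1, 0, 1, 0])

# View 1: bag of words over pre-tokenised documents.
bow = CountVectorizer(analyzer=lambda d: d).fit_transform(docs).toarray()

# View 2: mean word2vec vector per document.
w2v = Word2Vec(docs, vector_size=16, window=2, min_count=1, seed=0)
doc_vec = np.array([np.mean([w2v.wv[w] for w in d], axis=0) for d in docs])

X = np.hstack([bow, doc_vec])                                    # integrated representation
svm = SVC().fit(X, labels)                                       # single classifier
gbm = GradientBoostingClassifier(random_state=0).fit(X, labels)  # ensemble classifier
print(svm.predict(X), gbm.predict(X))
```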

Split Effect in Ensemble

  • Chung, Dong-Jun;Kim, Hyun-Joong
    • Proceedings of the Korean Statistical Society Conference / 2005.11a / pp.193-197 / 2005
  • The classification tree is one of the most suitable base learners for ensembles. Over the past decade, it has been found that bagging gives the most accurate predictions when used with unpruned trees, and boosting when used with stumps. Researchers have tried to understand the relationship between tree size and ensemble accuracy. Experiments show that large trees make boosting overfit the dataset, while stumps help avoid overfitting: the accuracy of each classifier must be sacrificed for better re-weighting at each iteration. Hence, the split effect in boosting can be explained by the trade-off between the accuracy of each classifier and better weighting of the misclassified points. In bagging, combining larger trees gives more accurate predictions because bagging has no such trade-off, so it is advisable to make each base classifier as accurate as possible.
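
The comparison discussed here is easy to reproduce in outline. The sketch below (scikit-learn 1.2+ parameter names, illustrative data and ensemble sizes) contrasts bagging and boosting with stumps versus unpruned trees; the general pattern reported above is bagging favouring large trees and boosting favouring stumps.

```python
# Bagging vs. boosting with stumps vs. unpruned trees (sketch).
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
stump = DecisionTreeClassifier(max_depth=1, random_state=0)
deep = DecisionTreeClassifier(max_depth=None, random_state=0)   # unpruned

settings = {
    "bagging + unpruned trees":  BaggingClassifier(estimator=deep, n_estimators=50, random_state=0),
    "bagging + stumps":          BaggingClassifier(estimator=stump, n_estimators=50, random_state=0),
    "boosting + stumps":         AdaBoostClassifier(estimator=stump, n_estimators=50, random_state=0),
    "boosting + unpruned trees": AdaBoostClassifier(estimator=deep, n_estimators=50, random_state=0),
}
for name, clf in settings.items():
    print(name, round(cross_val_score(clf, X, y, cv=5).mean(), 3))
```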
