• Title/Summary/Keyword: bagging


Effect of Grape-Bagging Paper Properties on Saccharinity of Grape (포도 당도에 영향하는 포도 재배용 봉지의 특성 효과)

  • 이장호;박종문;이진호;유병철
    • Journal of Korea Technical Association of The Pulp and Paper Industry
    • /
    • v.33 no.3
    • /
    • pp.52-58
    • /
    • 2001
  • The aim of using grape-bagging paper is to prevent damage from light and harmful insects during grape growth. Its use has been increasing as these benefits have been confirmed, but the technology for producing it has not yet been fully developed. In this study, the properties of grape-bagging paper were analyzed. The results showed that the air permeability and light transmittance of the paper are important. The influence of paper structure on air permeability, light transmittance, and the saccharinity (sugar content) of the grapes was examined. To obtain different paper structures, papers were produced at different freeness levels under several pressing conditions. Coloration of Campbell grapes with the new bagging paper started about 5 days earlier than with conventional bagging paper, and the saccharinity was improved by about 0.1-0.8 °Brix. Because the new bagging paper has a low apparent density, it improved both the saccharinity and the coloration time of the grapes.


SOHO Bankruptcy Prediction Using Modified Bagging Predictors (Modified Bagging Predictors를 이용한 SOHO 부도 예측)

  • Kim, Seung-Hyuk;Kim, Jong-Woo
    • Proceedings of the Korea Intelligent Information System Society Conference
    • /
    • /
    • pp.176-182
    • /
    • 2006
  • In this study, we propose a bankruptcy prediction model for SOHOs using Modified Bagging Predictors, a modification of the original Bagging Predictors. Although there have been many prior studies on bankruptcy prediction models for large and medium-sized companies, research on bankruptcy prediction models specifically for SOHOs is scarce. Unlike loan screening for large and medium-sized companies, banks' loan screening for SOHOs is still unsystematic and relies on fragmentary factors such as credit scores, which carries a high risk of bank insolvency caused by bad loans. This study used a real SOHO data set from a Korean bank. We first applied artificial neural networks and decision tree induction, which prior research had found effective for bankruptcy prediction, but could not obtain satisfactory results. We therefore propose and apply Bagging Predictors, which had rarely been used in prior bankruptcy prediction research, and Modified Bagging Predictors, an improvement on them. The results show that, for SOHO bankruptcy prediction, the proposed Modified Bagging Predictors outperform existing techniques such as artificial neural networks and Bagging Predictors. To confirm the usefulness of the proposed Modified Bagging Predictors, we additionally compared performance on a number of public data sets and found that Modified Bagging Predictors consistently outperformed the original Bagging Predictors.


SOHO Bankruptcy Prediction Using Modified Bagging Predictors (Modified Bagging Predictors를 이용한 SOHO 부도 예측)

  • Kim, Seung-Hyuk;Kim, Jong-Woo
    • Journal of Intelligence and Information Systems
    • /
    • v.13 no.2
    • /
    • pp.15-26
    • /
    • 2007
  • In this study, a SOHO (Small Office Home Office) bankruptcy prediction model is proposed using Modified Bagging Predictors, a modification of the traditional Bagging Predictors. There have been several studies on bankruptcy prediction for large and mid-size companies, but few have addressed SOHOs. In commercial banks, loan approval processes for SOHOs are usually less structured than those for larger companies and depend largely on partial information such as credit scores. In this study, we use a real SOHO loan approval data set from a Korean bank. First, decision tree induction techniques and artificial neural networks are applied to the data set, but the results are not satisfactory. Bagging Predictors, which had not previously been applied to bankruptcy prediction, and Modified Bagging Predictors, proposed in this paper, are then applied. The experimental results show that Modified Bagging Predictors provides better performance than decision tree induction techniques, artificial neural networks, and Bagging Predictors.


Comparison of ensemble pruning methods using Lasso-bagging and WAVE-bagging (분류 앙상블 모형에서 Lasso-bagging과 WAVE-bagging 가지치기 방법의 성능비교)

  • Kwak, Seungwoo;Kim, Hyunjoong
    • Journal of the Korean Data and Information Science Society
    • /
    • v.25 no.6
    • /
    • pp.1371-1383
    • /
    • 2014
  • A classification ensemble is a method that combines diverse classifiers to enhance classification accuracy. An ensemble method is known to be successful when the classifiers that participate in it are both accurate and diverse. In practice, however, an ensemble often includes less accurate and similar classifiers alongside accurate and diverse ones. Ensemble pruning methods construct an ensemble by choosing only accurate and diverse classifiers. In this article, we propose an ensemble pruning method called WAVE-bagging and compare its results with those of the existing pruning method called Lasso-bagging. We show that WAVE-bagging performs better than Lasso-bagging through an extensive empirical comparison using 26 real datasets.

Randomized Bagging for Bankruptcy Prediction (랜덤화 배깅을 이용한 재무 부실화 예측)

  • Min, Sung-Hwan
    • Journal of Information Technology Services
    • /
    • v.15 no.1
    • /
    • pp.153-166
    • /
    • 2016
  • Ensemble classification combines individually trained classifiers in order to improve prediction accuracy over individual classifiers. Ensemble techniques have been shown to be very effective in improving the generalization ability of a classifier, but the base classifiers need to be as accurate and diverse as possible to achieve this. Bagging is one of the most popular ensemble methods: different training data subsets are randomly drawn with replacement from the original training dataset, and base classifiers are trained on the different bootstrap samples. In this study, we propose a new bagging-variant ensemble model, Randomized Bagging (RBagging), that improves on the standard bagging ensemble model. The proposed model was applied to the bankruptcy prediction problem using a real data set, and the results were compared with those of other models. The experimental results showed that the proposed model outperformed the standard bagging model.
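The bootstrap-and-combine procedure that several of the abstracts above describe can be sketched in a few lines. This is a minimal illustration of standard bagging only (not the paper's RBagging, whose randomization scheme is not given here), assuming scikit-learn is available and substituting a synthetic dataset for the bankruptcy data:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

rng = np.random.RandomState(0)
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

n_estimators = 25
models = []
for _ in range(n_estimators):
    # Bootstrap sample: same size as the training set, drawn with replacement.
    idx = rng.randint(0, len(X_tr), size=len(X_tr))
    models.append(DecisionTreeClassifier(random_state=0).fit(X_tr[idx], y_tr[idx]))

# Combine the base classifiers by majority vote.
votes = np.stack([m.predict(X_te) for m in models])
ensemble_pred = (votes.mean(axis=0) >= 0.5).astype(int)
bagged_acc = accuracy_score(y_te, ensemble_pred)
```

Because each tree sees a different bootstrap sample, the base classifiers disagree on some cases, and the vote tends to cancel their individual errors.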

A Study on Bagging Neural Network for Predicting Defect Size of Steam Generator Tube in Nuclear Power Plant (원전 증기발생기 세관 결함 크기 예측을 위한 Bagging 신경회로망에 관한 연구)

  • Kim, Kyung-Jin;Jo, Nam-Hoon
    • Journal of the Korean Society for Nondestructive Testing
    • /
    • v.30 no.4
    • /
    • pp.302-310
    • /
    • 2010
  • In this paper, we study a Bagging neural network for predicting the defect size of steam generator (SG) tubes in nuclear power plants. Bagging is a method for creating an ensemble of estimators based on bootstrap sampling. To predict SG tube defect size, we first generated eddy current testing signals for 4 defect patterns of SG tube with various widths and depths. Then, we constructed a single neural network (SNN) and a Bagging neural network (BNN) to estimate the width and depth of each defect. The estimation performance of the SNN and BNN was measured by means of peak error. According to our experimental results, the average peak errors of the SNN and BNN for estimating defect depth were 0.117 and 0.089 mm, respectively. In the case of estimating defect width, the average peak errors of the SNN and BNN were 0.494 and 0.306 mm, respectively. This shows that the estimation performance of the BNN is superior to that of the SNN.
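For a regression task like the defect-size estimation above, a bagged neural network simply averages the predictions of base networks trained on bootstrap samples. The sketch below is a generic stand-in under stated assumptions (scikit-learn's MLPRegressor on a synthetic target, not the paper's eddy-current signals or network architecture):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.RandomState(0)
X = rng.uniform(-1, 1, size=(200, 3))
# Synthetic stand-in for a defect-size target (hypothetical, for illustration).
y = X[:, 0] ** 2 + 0.5 * X[:, 1] + rng.normal(0, 0.05, size=200)

preds = []
for seed in range(5):
    idx = rng.randint(0, len(X), size=len(X))  # bootstrap sample
    net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=500, random_state=seed)
    net.fit(X[idx], y[idx])
    preds.append(net.predict(X))

# BNN output: the mean over the base networks, which reduces prediction variance.
bnn_pred = np.mean(preds, axis=0)
```

Averaging is the standard combination rule for bagged regressors, just as majority voting is for bagged classifiers.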

Classification Performance Improvement of Steam Generator Tube Defects in Nuclear Power Plant Using Bagging Method (Bagging 방법을 이용한 원전SG 세관 결함패턴 분류성능 향상기법)

  • Lee, Jun-Po;Jo, Nam-Hoon
    • The Transactions of The Korean Institute of Electrical Engineers
    • /
    • v.58 no.12
    • /
    • pp.2532-2537
    • /
    • 2009
  • For defect characterization in steam generator tubes in nuclear power plants, artificial neural networks have been extensively used to classify defect types. In this paper, we study the effectiveness of Bagging for improving the performance of a neural network for the classification of tube defects. Bagging is a method that combines the outputs of many neural networks trained separately on different training data sets. By varying the number of neurons in the hidden layer, we carry out computer simulations to compare the classification performance of a bagging neural network and a single neural network. From the experiments, we found that the performance of the bagging neural network is superior to the average performance of a single neural network in most cases.

Preparation and properties of embossing-treated fruit bagging paper (Embossing 처리 과대지의 제조 및 물성)

  • Kim, Kang-Jae;Byeon, Jong-Sang;Kim, Dae-Keun;Eom, Tae-Jin
    • Proceedings of the Korea Technical Association of the Pulp and Paper Industry Conference
    • /
    • /
    • pp.201-206
    • /
    • 2006
  • Embossing-treated fruit bagging paper was prepared with a newly designed embossing roll, and its mechanical properties were evaluated. First, an embossing roll for the embossing process of fruit bags was designed at Ginyong Embo Co. The embossing-treated fruit bagging paper was then manufactured at plant scale at Agro Co. The mechanical properties of the embossed paper were investigated, and the operating efficiency of the bagging was tested in the field. The properties of the embossed paper were quite satisfactory for fruit bags used in apple and pear cultivation.


Developing an Ensemble Classifier for Bankruptcy Prediction (부도 예측을 위한 앙상블 분류기 개발)

  • Min, Sung-Hwan
    • Journal of the Korea Industrial Information Systems Research
    • /
    • v.17 no.7
    • /
    • pp.139-148
    • /
    • 2012
  • An ensemble of classifiers employs a set of individually trained classifiers and combines their predictions. It has been found that in most cases ensembles produce more accurate predictions than the base classifiers. Combining outputs from multiple classifiers, known as ensemble learning, is one of the standard and most important techniques for improving classification accuracy in machine learning. An ensemble of classifiers is efficient only if the individual classifiers make decisions that are as diverse as possible. Bagging is the most popular ensemble learning method for generating a diverse set of classifiers. Diversity in bagging is obtained by using different training sets: the training data subsets are randomly drawn with replacement from the entire training dataset. The random subspace method is an ensemble construction technique that uses different attribute subsets; the training dataset is also modified, as in bagging, but the modification is performed in the feature space. Bagging and random subspace are well-known and popular ensemble algorithms, but few studies have dealt with their integration using SVM classifiers, even though there is great potential for useful applications in this area. The focus of this paper is to propose methods for improving SVM performance using a hybrid ensemble strategy for bankruptcy prediction. The proposed ensemble model is applied to the bankruptcy prediction problem using a real data set from Korean companies.

Comparing Classification Accuracy of Ensemble and Clustering Algorithms Based on Taguchi Design (다구찌 디자인을 이용한 앙상블 및 군집분석 분류 성능 비교)

  • Shin, Hyung-Won;Sohn, So-Young
    • Journal of Korean Institute of Industrial Engineers
    • /
    • v.27 no.1
    • /
    • pp.47-53
    • /
    • 2001
  • In this paper, we compare the classification performance of ensemble and clustering algorithms (Data Bagging, Variable Selection Bagging, Parameter Combining, Clustering) with that of logistic regression, in consideration of various characteristics of the input data. The four factors used to simulate the logistic model are (1) correlation among input variables, (2) variance of observations, (3) training data size, and (4) the input-output function. In view of the unknown relationship between the input and output function, we use a Taguchi design to improve the practicality of our study results by treating it as a noise factor. The experimental results indicate the following. When the level of variance is medium, Bagging & Parameter Combining performs worse than Logistic Regression, Variable Selection Bagging, and Clustering. However, the classification performances of Logistic Regression, Variable Selection Bagging, Bagging, and Clustering are not significantly different when the variance of the input data is either small or large. When there is strong correlation among input variables, Variable Selection Bagging outperforms both Logistic Regression and Parameter Combining. In general, the Parameter Combining algorithm performs the worst, to our disappointment.
