A Study on Automatic Learning of Weight Decay Neural Network

  • Hwang, Chang-Ha (Department of Statistical Information, Catholic University of Taegu) ;
  • Na, Eun-Young (Graduate School, Catholic University of Taegu) ;
  • Seok, Kyung-Ha (Department of Data Science, Inje University)
  • Published : 2001.10.30

Abstract

Neural networks are increasingly being seen as an addition to the statistics toolkit, to be considered alongside both classical and modern statistical methods. They are most often used for classification and function estimation. In this paper we concentrate on function estimation using neural networks with a weight decay factor. The use of weight decay appears both to help the optimization process and to avoid overfitting. For this type of network, choosing the number of hidden nodes, the weight decay parameter, and the number of learning iterations is very important; we call this the optimization of the weight decay neural network. We propose an automatic optimization based on genetic algorithms, and we compare the weight decay neural network learned by this automatic optimization with an ordinary neural network, projection pursuit regression, and support vector machines.
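The idea described above can be sketched in code. The following is a minimal illustration (not the authors' implementation) of function estimation with a one-hidden-layer network trained on a weight-decay-penalized least-squares loss; the hidden-node count, decay parameter `lam`, learning rate `lr`, and iteration count `iters` are exactly the quantities the paper tunes. All names and values here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_wd_net(x, y, hidden=5, lam=1e-3, lr=0.05, iters=3000):
    """Fit f(x) ~ y with a tanh hidden layer and weight decay (L2 penalty)."""
    n = x.shape[0]
    W1 = rng.normal(scale=0.5, size=(1, hidden))   # input -> hidden weights
    b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=(hidden, 1))   # hidden -> output weights
    b2 = np.zeros(1)
    for _ in range(iters):
        h = np.tanh(x @ W1 + b1)                   # hidden activations
        yhat = h @ W2 + b2
        err = yhat - y
        # gradients (up to a constant factor) of the penalized loss
        #   E = mean((y - yhat)^2) + lam * sum(w^2)
        gW2 = h.T @ err / n + lam * W2
        gb2 = err.mean(axis=0)
        dh = (err @ W2.T) * (1 - h**2)             # backprop through tanh
        gW1 = x.T @ dh / n + lam * W1
        gb1 = dh.mean(axis=0)
        W2 -= lr * gW2; b2 -= lr * gb2
        W1 -= lr * gW1; b1 -= lr * gb1
    return lambda xn: np.tanh(xn @ W1 + b1) @ W2 + b2

# toy function-estimation problem: recover sin(x) from noisy samples
x = np.linspace(-2, 2, 80).reshape(-1, 1)
y = np.sin(x) + rng.normal(scale=0.1, size=x.shape)
f = fit_wd_net(x, y)
mse = float(np.mean((f(x) - np.sin(x)) ** 2))
```

The decay term `lam * sum(w^2)` shrinks the weights toward zero, which smooths the fitted function and is the source of the overfitting control mentioned above.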

Neural networks are increasingly regarded as modern statistical methods for classification and function estimation. In particular, they provide a flexible way of generalizing linear regression functions and can be viewed as a means of parameterizing general nonlinear functions. This paper considers neural networks for function estimation. A simple way to keep a network from overfitting the training data is regularization, and in neural networks regularization is usually carried out by the weight decay method. When a weight decay neural network is used for function estimation, the number of hidden nodes, the weight decay parameter, the learning rate, and the number of learning iterations are the important parameters. We propose a method that automatically optimizes these parameters using a genetic algorithm, thereby obtaining an automatically learned weight decay neural network, and we compare the automatically learned network with other function estimation methods.
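The genetic-algorithm optimization described above can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: a chromosome encodes the tuned quantities (hidden-node count, log10 of the weight decay parameter, log10 of the iteration count), and a standard selection/crossover/mutation loop minimizes a fitness value. For a self-contained, runnable example the fitness below is a stand-in surrogate with a known minimum; in the paper's setting it would instead train a weight decay network with the encoded parameters and return its validation error.

```python
import random

random.seed(0)

# search ranges (assumed): hidden nodes, log10(decay), log10(iterations)
BOUNDS = [(1, 20), (-5.0, 0.0), (2.0, 4.0)]

def surrogate_val_error(c):
    """Stand-in for validation error, minimized at (8, -3, 3.3)."""
    hidden, loglam, logit = c
    return (hidden - 8) ** 2 / 50 + (loglam + 3) ** 2 + (logit - 3.3) ** 2

def random_chrom():
    return [random.uniform(lo, hi) for lo, hi in BOUNDS]

def crossover(a, b):
    return [random.choice(pair) for pair in zip(a, b)]  # uniform crossover

def mutate(c, rate=0.2, sigma=0.3):
    return [min(hi, max(lo, g + random.gauss(0, sigma)))  # clamp to bounds
            if random.random() < rate else g
            for g, (lo, hi) in zip(c, BOUNDS)]

def ga(fitness, pop_size=30, generations=40):
    pop = [random_chrom() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)                  # lower error = fitter
        elite = pop[: pop_size // 3]           # truncation selection
        pop = elite + [mutate(crossover(random.choice(elite),
                                        random.choice(elite)))
                       for _ in range(pop_size - len(elite))]
    return min(pop, key=fitness)

best = ga(surrogate_val_error)
hidden = round(best[0])
decay = 10 ** best[1]
iters = int(round(10 ** best[2]))
```

Encoding the decay parameter and iteration count on a log scale is a common choice because validation error typically varies over orders of magnitude in those quantities.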

References

  1. 공성곤, 김인택, 박대희, 박주영, 신요한, 유전자 알고리즘 (Genetic Algorithms)
  2. Bishop, C. M., Neural Networks for Pattern Recognition
  3. Chapelle, O. and Vapnik, V. N., Model Selection for Support Vector Machines, Advances in Neural Information Processing Systems, v.12
  4. Goldberg, D. E., Genetic Algorithms in Search, Optimization and Machine Learning
  5. Gunn, S., Support Vector Machines for Classification and Regression, ISIS Technical Report, University of Southampton
  6. Hwang, J.-N., Lay, S.-R., Maechler, M., Martin, D., and Schimert, J., Regression Modeling in Back-Propagation and Projection Pursuit Learning, IEEE Transactions on Neural Networks, v.5
  7. Orr, M. J. L., Optimising the Widths of Radial Basis Functions, Fifth Brazilian Symposium on Neural Networks
  8. Ripley, B. D., Pattern Recognition and Neural Networks
  9. Ripley, B. D. and Ripley, R. M., Neural Networks as Statistical Methods in Survival Analysis, in Dybowski, R. and Gant, V. (eds.), Artificial Neural Networks: Prospects for Medicine
  10. Roosen, C. B. and Hastie, T. J., Automatic Smoothing Spline Projection Pursuit, Journal of Computational and Graphical Statistics, v.3
  11. Smola, A. J. and Schölkopf, B., A Tutorial on Support Vector Regression, NeuroCOLT2 Technical Report, NeuroCOLT
  12. Wahba, G. and Wold, S., A Completely Automatic French Curve, Communications in Statistics, v.4
  13. Werbos, P., Backpropagation: Past and Future, Proceedings of the IEEE International Conference on Neural Networks