Optimization of Sigmoid Activation Function Parameters using Genetic Algorithms and Pattern Recognition Analysis in Input Space of Two Spirals Problem

  • Received : 2009.12.07
  • Accepted : 2010.03.18
  • Published : 2010.04.28

Abstract

This paper presents an optimization of the sigmoid activation function parameters using genetic algorithms, together with a pattern recognition analysis in the input space of the two-spirals benchmark problem. The cascade-correlation learning algorithm is used for the experiments. In the first experiment, the standard sigmoid activation function is used to analyze pattern classification in the input space of the two-spirals problem. In the second experiment, a pool of eight candidate neurons is composed of sigmoid activation functions with different fixed parameter values. In the third experiment, genetic algorithms are used to determine the values of the three parameters that control the displacement of the sigmoid function, and the resulting parameter values are applied to the sigmoid activation functions of the candidate neurons. To evaluate the performance of these algorithms, the classification of the training input patterns at each step is visualized as the shape of the two spirals.
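The abstract does not give the exact parametrization of the sigmoid or the genetic-algorithm settings, so the following Python sketch is only an illustration under assumed definitions: a three-parameter sigmoid f(x) = a / (1 + exp(-b(x - c))), the standard two-spirals generator, and a toy fitness measure for a single unit (in the paper, the GA-selected parameters are instead applied to the candidate neurons of a cascade-correlation network).

# Hypothetical sketch: a three-parameter sigmoid and a simple genetic algorithm
# that searches for parameter values, evaluated on two-spirals data.
# The parametrization and the fitness criterion are illustrative assumptions,
# not the paper's definitions.
import math
import random

def two_spirals(n_points=97):
    """Generate the classic two-spirals benchmark (two interleaved classes)."""
    data = []
    for i in range(n_points):
        r = 6.5 * (104 - i) / 104.0
        phi = i * math.pi / 16.0
        x, y = r * math.cos(phi), r * math.sin(phi)
        data.append(((x, y), 1))     # first spiral
        data.append(((-x, -y), 0))   # mirrored spiral
    return data

def sigmoid(x, a=1.0, b=1.0, c=0.0):
    """Three-parameter sigmoid: amplitude a, slope b, horizontal shift c."""
    return a / (1.0 + math.exp(-b * (x - c)))

def fitness(params, data):
    """Toy fitness: accuracy of a single sigmoid unit on a fixed 1-D projection."""
    a, b, c = params
    correct = 0
    for (x, y), label in data:
        out = sigmoid(x + y, a, b, c)          # crude projection of the 2-D input
        correct += int((out > 0.5 * a) == bool(label))
    return correct / len(data)

def genetic_search(data, pop_size=20, generations=50):
    """Plain GA: truncation selection, blend crossover, Gaussian mutation."""
    pop = [[random.uniform(0.5, 2.0), random.uniform(0.1, 5.0), random.uniform(-3, 3)]
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda p: fitness(p, data), reverse=True)
        parents = scored[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = random.sample(parents, 2)
            child = [(g1 + g2) / 2.0 + random.gauss(0, 0.1) for g1, g2 in zip(p1, p2)]
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda p: fitness(p, data))

if __name__ == "__main__":
    data = two_spirals()
    best = genetic_search(data)
    print("best (a, b, c):", best, "fitness:", fitness(best, data))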

Keywords

Cascade Correlation Algorithm; Activation Function; Sigmoid Function; Two Spirals Problem
