 Title & Authors
Contour Plots of Objective Functions for Feed-Forward Neural Networks
Oh, Sang-Hoon;
 Abstract
Error surfaces provide important information for the training of feed-forward neural networks (FNNs). In this paper, we draw contour plots of various error or objective functions used to train FNNs. First, the weakness of the mean-squared error for classification tasks is explained from the viewpoint of its error contour plot. The classification figure of merit, the mean log-square error, the cross-entropy error, and the n-th order extension of the cross-entropy error are then examined through their contour plots. The recently proposed target node method is also interpreted from the same viewpoint. Based on these contour plots, we explain the characteristics of the various error and objective functions as FNN training proceeds.
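As a rough illustration of the kind of plot the abstract describes, the sketch below is not the paper's code: the single sigmoid node, the toy two-class data set, the weight grid, and the use of NumPy/Matplotlib are all assumptions made for this example. It evaluates the mean-squared error and the cross-entropy error over a grid of two weights and draws their contour plots side by side, which is enough to see how differently the two surfaces behave near saturated outputs.

# Minimal sketch (assumed setup, not the paper's experiments): contour plots of the
# mean-squared error and cross-entropy error of a single sigmoid node
# y = s(w1*x1 + w2*x2) on a toy two-class data set, with only w1 and w2 varied
# so that each objective can be drawn as a 2-D contour plot.
import numpy as np
import matplotlib.pyplot as plt

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

# Toy two-class data (chosen only for illustration): inputs X and 0/1 targets t.
X = np.array([[1.0, 0.2], [0.8, 0.6], [-0.5, -1.0], [-1.0, -0.3]])
t = np.array([1.0, 1.0, 0.0, 0.0])

# Grid of the two weights over which the objective functions are evaluated.
w1, w2 = np.meshgrid(np.linspace(-6, 6, 201), np.linspace(-6, 6, 201))

mse = np.zeros_like(w1)
cee = np.zeros_like(w1)
eps = 1e-12  # keeps log() finite when the output saturates at 0 or 1
for x, tk in zip(X, t):
    y = sigmoid(w1 * x[0] + w2 * x[1])  # node output on the whole weight grid
    mse += (tk - y) ** 2                # mean-squared error term
    cee += -(tk * np.log(y + eps) + (1 - tk) * np.log(1 - y + eps))  # cross-entropy term
mse /= len(t)
cee /= len(t)

fig, axes = plt.subplots(1, 2, figsize=(10, 4))
for ax, z, title in zip(axes, (mse, cee), ("Mean-squared error", "Cross-entropy error")):
    cs = ax.contour(w1, w2, z, levels=20)
    ax.clabel(cs, inline=True, fontsize=7)
    ax.set_xlabel("w1")
    ax.set_ylabel("w2")
    ax.set_title(title)
plt.tight_layout()
plt.show()

Under these assumptions, the cross-entropy contours stay steep in the regions where the sigmoid output saturates on the wrong side of the target, whereas the mean-squared-error contours flatten out there; this is the kind of qualitative difference the paper reads off its contour plots.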
 Keywords
Feed-forward Neural Network; Error Function; Objective Function; Contour Plot
 Language
English
 References
1. K. Hornik, M. Stinchcombe, and H. White, "Multilayer Feedforward Networks are Universal Approximators," Neural Networks, vol. 2, 1989, pp. 359-366.
2. K. Hornik, "Approximation Capabilities of Multilayer Feedforward Networks," Neural Networks, vol. 4, 1991, pp. 251-257.
3. S. Suzuki, "Constructive Function Approximation by Three-Layer Artificial Neural Networks," Neural Networks, vol. 11, 1998, pp. 1049-1058.
4. Y. Liao, S. C. Fang, and H. L. W. Nuttle, "Relaxed Conditions for Radial-Basis Function Networks to be Universal Approximators," Neural Networks, vol. 16, 2003, pp. 1019-1028.
5. D. E. Rumelhart and J. L. McClelland, Parallel Distributed Processing, MIT Press, Cambridge, MA, 1986.
6. C. Wang and J. C. Principe, "Training Neural Networks with Additive Noise in the Desired Signal," IEEE Trans. Neural Networks, vol. 10, 1999, pp. 1511-1517.
7. J. B. Hampshire and A. H. Waibel, "A Novel Objective Function for Improved Phoneme Recognition Using Time-Delay Neural Networks," IEEE Trans. Neural Networks, vol. 1, 1990, pp. 216-228.
8. K. Liano, "Robust Error Measure for Supervised Neural Network Learning with Outliers," IEEE Trans. Neural Networks, vol. 7, 1996, pp. 246-250.
9. A. van Ooyen and B. Nienhuis, "Improving the Convergence of the Back-Propagation Algorithm," Neural Networks, vol. 5, 1992, pp. 465-471.
10. S.-H. Oh, "Improving the Error Back-Propagation Algorithm with a Modified Error Function," IEEE Trans. Neural Networks, vol. 8, 1997, pp. 799-803.
11. S.-H. Oh, "Error Back-Propagation Algorithm for Classification of Imbalanced Data," Neurocomputing, vol. 74, 2011, pp. 1058-1061.
12. H. White, "Learning in Artificial Neural Networks: A Statistical Perspective," Neural Computation, vol. 1, no. 4, Winter 1989, pp. 425-464.
13. M. D. Richard and R. P. Lippmann, "Neural Network Classifiers Estimate Bayesian a Posteriori Probabilities," Neural Computation, vol. 3, 1991, pp. 461-483.
14. S.-H. Oh, "Statistical Analyses of Various Error Functions for Pattern Classifiers," Proc. Convergence and Hybrid Information Technology, CCIS vol. 206, 2011, pp. 129-133.
15. S.-H. Oh, "A Statistical Perspective of Neural Networks for Imbalanced Data Problems," Int. Journal of Contents, vol. 7, 2011, pp. 1-5.
16. N. Baba, "A New Approach for Finding the Global Minimum of Error Function of Neural Networks," Neural Networks, vol. 2, 1989, pp. 367-373.
17. Y. Lee, S.-H. Oh, and M. W. Kim, "An Analysis of Premature Saturation in Back-Propagation Learning," Neural Networks, vol. 6, 1993, pp. 719-728.