Contour Plots of Objective Functions for Feed-Forward Neural Networks

- Journal title : International Journal of Contents
- Volume 8, Issue 4, 2012, pp. 30-35
- Publisher : The Korea Contents Association
- DOI : 10.5392/IJoC.2012.8.4.030

Title & Authors

Contour Plots of Objective Functions for Feed-Forward Neural Networks

Oh, Sang-Hoon

Abstract

Error surfaces provide important information for the training of feed-forward neural networks (FNNs). In this paper, we draw contour plots of various error or objective functions used to train FNNs. First, when FNNs are applied to classification, the weakness of the mean-squared error is explained from the viewpoint of its error contour plot. The classification figure of merit, the mean log-square error, the cross-entropy error, and the n-th order extension of the cross-entropy error are then examined through their contour plots. The recently proposed target node method is also explained from the contour-plot viewpoint. Based on these contour plots, we can explain the characteristics of the various error or objective functions as the training of FNNs proceeds.
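The contrast the abstract draws between mean-squared error and cross-entropy can be sketched numerically. The following is a minimal illustration, not code from the paper: it evaluates both errors on a grid of two sigmoidal outputs with targets (1, 0), the kind of grid a contour plot would be drawn over. All variable names and the grid resolution are illustrative assumptions.

```python
import numpy as np

# Grid of two sigmoidal outputs (y1, y2) in (0, 1); each grid point is
# one candidate network output, as on the axes of a contour plot.
y = np.linspace(0.01, 0.99, 99)
Y1, Y2 = np.meshgrid(y, y)
t1, t2 = 1.0, 0.0  # classification targets for the two output nodes

# Mean-squared error: bounded, and nearly flat near the saturated
# wrong corner (y1 -> 0, y2 -> 1), which weakens its gradients there.
mse = 0.5 * ((Y1 - t1) ** 2 + (Y2 - t2) ** 2)

# Cross-entropy error: grows without bound toward the same wrong
# corner, so its contours stay dense and its gradients stay strong.
ce = -(t1 * np.log(Y1) + (1 - t1) * np.log(1 - Y1)
       + t2 * np.log(Y2) + (1 - t2) * np.log(1 - Y2))
```

With matplotlib, `plt.contour(Y1, Y2, mse)` and `plt.contour(Y1, Y2, ce)` would render the corresponding contour plots; the unbounded growth of the cross-entropy near a confidently wrong output is what the paper's contour plots visualize as an advantage over MSE for classification.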

Keywords

Feed-forward Neural Network; Error Function; Objective Function; Contour Plot

Language

English

References

1.

K. Hornik, M. Stinchcombe, and H. White, "Multilayer Feed-forward Networks are Universal Approximators," Neural Networks, vol.2, 1989, pp. 359-366.

2.

K. Hornik, "Approximation Capabilities of Multilayer Feedforward Networks," Neural Networks, vol.4, 1991, pp. 251-257.

3.

S. Suzuki, "Constructive Function Approximation by Three-Layer Artificial Neural Networks," Neural Networks, vol.11, 1998, pp. 1049-1058.

4.

Y. Liao, S. C. Fang, and H. L. W. Nuttle, "Relaxed Conditions for Radial-Basis Function Networks to be Universal Approximators," Neural Networks, vol.16, 2003, pp. 1019-1028.

5.

D. E. Rumelhart and J. L. McClelland, Parallel Distributed Processing, MIT Press, Cambridge, MA, 1986.

6.

C. Wang and J. C. Principe, "Training Neural Networks with Additive Noise in the Desired Signal," IEEE Trans. Neural Networks, vol.10, 1999, pp. 1511-1517.

7.

J. B. Hampshire and A. H. Waibel, "A Novel Objective Function for Improved Phoneme Recognition Using Time-Delay Neural Networks," IEEE Trans. Neural Networks, vol.1, 1990, pp. 216-228.

8.

K. Liano, "Robust Error Measure for Supervised Neural Network Learning with Outliers," IEEE Trans. Neural Networks, vol.7, 1996, pp. 246-250.

9.

A. van Ooyen and B. Nienhuis, "Improving the Convergence of the Backpropagation Algorithm," Neural Networks, vol.5, 1992, pp. 465-471.

10.

S.-H. Oh, "Improving the Error Back-Propagation Algorithm with a Modified Error Function," IEEE Trans. Neural Networks, vol.8, 1997, pp. 799-803.

11.

S.-H. Oh, "Error Back-Propagation Algorithm for Classification of Imbalanced Data," Neurocomputing, vol.74, 2011, pp. 1058-1061.

12.

H. White, "Learning in Artificial Neural Networks: A Statistical Perspective," Neural Computation, vol.1, no.4, Winter 1989, pp. 425-464.

13.

M. D. Richard and R. P. Lippmann, "Neural Network Classifiers Estimate Bayesian a posteriori Probabilities," Neural Computation, vol.3, no.4, 1991, pp. 461-483.

14.

S.-H. Oh, "Statistical Analyses of Various Error Functions for Pattern Classifiers," Proc. Convergence on Hybrid Information Technology, CCIS vol. 206, 2011, pp. 129-133.

15.

S.-H. Oh, "A Statistical Perspective of Neural Networks for Imbalanced Data Problems," Int. Journal of Contents, vol.7, 2011, pp. 1-5.