Comparison of Objective Functions for Feed-forward Neural Network Classifiers Using Receiver Operating Characteristics Graph
  • Journal title : International Journal of Contents
  • Volume 10, Issue 1, 2014, pp. 23-28
  • Publisher : The Korea Contents Association
  • DOI : 10.5392/IJoC.2014.10.1.023
 Title & Authors
Oh, Sang-Hoon; Wakuya, Hiroshi
When developing classifiers with various objective functions, it is important to compare their performances. Although statistical analyses of objective functions for classifiers exist, simulation results provide a direct comparison, and in that case the choice of comparison criterion is critical. A Receiver Operating Characteristics (ROC) graph is a simulation technique for comparing classifiers and selecting the better one based on performance. In this paper, we adopt the ROC graph to compare classifiers trained with the mean-squared error, cross-entropy error, classification figure of merit, and the n-th order extension of cross-entropy error functions. After training feed-forward neural networks on the CEDAR database, the ROC graphs are plotted to help identify which objective function is better.
Keywords: Receiver Operating Characteristic Graph; Feed-Forward Neural Networks; Objective Function; Performance Comparison
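As a concrete sketch of how the ROC graph in the abstract is built from a trained classifier's outputs: sweep a decision threshold over the output scores and record (false positive rate, true positive rate) points, following the threshold-sweeping procedure described in Fawcett's ROC tutorial. This is an illustration only, not the paper's CEDAR experiment; the labels and scores used below are hypothetical.

```python
# Illustrative ROC-point and AUC computation from classifier output scores.
# Hypothetical inputs: labels are 0/1 class indicators, scores are the
# classifier's continuous outputs (e.g. a feed-forward network's sigmoid output).

def roc_points(labels, scores):
    """Sweep thresholds from high to low; return lists of (FPR, TPR) points."""
    pairs = sorted(zip(scores, labels), key=lambda p: -p[0])
    P = sum(labels)              # number of positives
    N = len(labels) - P          # number of negatives
    tp = fp = 0
    fprs, tprs = [0.0], [0.0]
    prev_score = None
    for s, y in pairs:
        # Emit a point only when the threshold actually changes,
        # so tied scores produce a single point.
        if s != prev_score:
            fprs.append(fp / N)
            tprs.append(tp / P)
            prev_score = s
        if y == 1:
            tp += 1
        else:
            fp += 1
    fprs.append(1.0)             # final point: everything classified positive
    tprs.append(1.0)
    return fprs, tprs

def auc(fprs, tprs):
    """Trapezoidal area under the ROC curve; 1.0 = perfect separation."""
    return sum((fprs[i + 1] - fprs[i]) * (tprs[i + 1] + tprs[i]) / 2
               for i in range(len(fprs) - 1))
```

Given two classifiers trained with different objective functions, plotting their `roc_points` on one graph (or comparing their `auc` values) gives the kind of direct, threshold-independent comparison the paper uses to decide which objective function is better.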
 References
K. Hornik, M. Stinchcombe, and H. White, "Multilayer Feedforward Networks Are Universal Approximators," Neural Networks, vol. 2, 1989, pp. 359-366.

K. Hornik, "Approximation Capabilities of Multilayer Feedforward Networks," Neural Networks, vol. 4, 1991, pp. 251-257.

S. Suzuki, "Constructive Function Approximation by Three-Layer Artificial Neural Networks," Neural Networks, vol. 11, 1998, pp. 1049-1058.

Y. Liao, S. C. Fang, and H. L. W. Nuttle, "Relaxed Conditions for Radial-Basis Function Networks to be Universal Approximators," Neural Networks, vol. 16, 2003, pp. 1019-1028.

D. E. Rumelhart and J. L. McClelland, Parallel Distributed Processing, MIT Press, Cambridge, MA, 1986.

A. van Ooyen and B. Nienhuis, "Improving the Convergence of the Backpropagation Algorithm," Neural Networks, vol. 5, 1992, pp. 465-471.

S. H. Oh, "Improving the Error Back-Propagation Algorithm with a Modified Error Function," IEEE Trans. Neural Networks, vol. 8, 1997, pp. 799-803.

J. B. Hampshire and A. H. Waibel, "A Novel Objective Function for Improved Phoneme Recognition Using Time-Delay Neural Networks," IEEE Trans. Neural Networks, vol. 1, 1990, pp. 216-228.

S. H. Oh, "Contour Plots of Objective Functions for Feed-Forward Neural Networks," Int. Journal of Contents, vol. 8, no. 4, 2012, pp. 30-35.

H. White, "Learning in Artificial Neural Networks: A Statistical Perspective," Neural Computation, vol. 1, no. 4, 1989, pp. 425-464.

M. D. Richard and R. P. Lippmann, "Neural Network Classifiers Estimate Bayesian a Posteriori Probabilities," Neural Computation, vol. 3, 1991, pp. 461-483.

S. H. Oh, "Statistical Analyses of Various Error Functions for Pattern Classifiers," Proc. Convergence and Hybrid Information Technology, CCIS vol. 206, 2011, pp. 129-133.

S. H. Oh, "A Statistical Perspective of Neural Networks for Imbalanced Data Problems," Int. Journal of Contents, vol. 7, 2011, pp. 1-5.

I. Kononenko and I. Bratko, "Information-Based Evaluation Criterion for Classifiers' Performance," Machine Learning, vol. 6, 1991, pp. 67-80.

T. Fawcett, "An Introduction to ROC Analysis," Pattern Recognition Letters, vol. 27, 2006, pp. 861-874.

R. N. McDonough and A. D. Whalen, Detection of Signals in Noise, Academic Press, 1995.

Z. H. Michalopoulou, L. W. Nolte, and D. Alexandrou, "Performance Evaluation of Multilayer Perceptrons in Signal Detection and Classification," IEEE Trans. Neural Networks, vol. 6, 1995, pp. 381-386.

R. Bi, Y. Zhou, F. Lu, and W. Wang, "Predicting Gene Ontology Functions Based on Support Vector Machines and Statistical Significance Estimation," Neurocomputing, vol. 70, 2007, pp. 718-725.

L. Bruzzone and S. B. Serpico, "Classification of Remote-Sensing Data by Neural Networks," Pattern Recognition Letters, vol. 18, 1997, pp. 1323-1328.

Y. M. Huang, C. M. Hung, and H. C. Jiau, "Evaluation of Neural Networks and Data Mining Methods on a Credit Assessment Task for Class Imbalance Problem," Nonlinear Analysis, vol. 7, 2006, pp. 720-747.

S. H. Oh, "Error Back-Propagation Algorithm for Classification of Imbalanced Data," Neurocomputing, vol. 74, 2011, pp. 1058-1061.

Y. Lee, S. H. Oh, and M. W. Kim, "An Analysis of Premature Saturation in Back-Propagation Learning," Neural Networks, vol. 6, 1993, pp. 719-728.

J. J. Hull, "A Database for Handwritten Text Recognition Research," IEEE Trans. Pattern Anal. Machine Intell., vol. 16, 1994, pp. 550-554.

Y. Bengio, "Learning Deep Architectures for AI," Foundations and Trends in Machine Learning, vol. 2, 2009, pp. 1-127.