Bayesian Analysis for Neural Network Models

  • Chung, Younshik (Department of Statistics and Research Institute of Computer and Information Communication, Pusan National University) ;
  • Jung, Jinhyouk (DB Marketing Team, Sejung Co.) ;
  • Kim, Chansoo (Research Institute of Computer and Information Communication, Pusan National University)
  • Published : 2002.04.01

Abstract

Neural networks are a popular and very flexible tool for classification, with many applications in pattern classification and pattern recognition. This paper focuses on a Bayesian approach to feed-forward neural networks with a single hidden layer of units with logistic activation. For a network with p input units, one hidden layer of m hidden nodes, and one output unit, we are interested in deciding the number of hidden nodes in a Bayesian setup with m fixed. We incorporate a latent variable into the prior of the regression coefficients and introduce a 'sequential step' based on the data-augmentation idea of Tanner and Wong (1987). MCMC methods (the Gibbs sampler and the Metropolis algorithm) are used to overcome the complicated Bayesian computation. Finally, the proposed method is applied to simulated data.
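As a rough illustration of the model class the abstract describes, the sketch below fits a single-hidden-layer network with logistic activation by random-walk Metropolis sampling. This is a minimal stand-in, not the paper's actual sampler: the paper combines Gibbs steps with a latent-variable prior for node selection, while here all weights get plain N(0, tau2) priors and the network size, step size, and simulated data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def nn_output(X, beta, gamma):
    # Feed-forward net: p inputs, m logistic hidden nodes, one linear output.
    # gamma: (m, p+1) hidden-layer weights (bias in column 0)
    # beta:  (m+1,)   output-layer weights (bias in position 0)
    H = sigmoid(np.c_[np.ones(len(X)), X] @ gamma.T)  # (n, m) hidden activations
    return beta[0] + H @ beta[1:]

def log_posterior(theta, X, y, m, p, sigma2=0.25, tau2=25.0):
    # Gaussian likelihood with variance sigma2; independent N(0, tau2)
    # priors on all weights (an assumption, simpler than the paper's prior).
    beta, gamma = theta[:m + 1], theta[m + 1:].reshape(m, p + 1)
    resid = y - nn_output(X, beta, gamma)
    return -0.5 * resid @ resid / sigma2 - 0.5 * theta @ theta / tau2

def metropolis(X, y, m, n_iter=2000, step=0.05):
    # Random-walk Metropolis over the full weight vector.
    p = X.shape[1]
    theta = rng.normal(0, 0.5, size=(m + 1) + m * (p + 1))
    lp = log_posterior(theta, X, y, m, p)
    draws = np.empty((n_iter, theta.size))
    accepted = 0
    for t in range(n_iter):
        prop = theta + rng.normal(0, step, size=theta.size)
        lp_prop = log_posterior(prop, X, y, m, p)
        if np.log(rng.uniform()) < lp_prop - lp:  # accept/reject step
            theta, lp = prop, lp_prop
            accepted += 1
        draws[t] = theta
    return draws, accepted / n_iter

# Simulated data: smooth function of one input plus Gaussian noise
X = rng.uniform(-3, 3, size=(60, 1))
y = np.tanh(X[:, 0]) + rng.normal(0, 0.3, size=60)
draws, rate = metropolis(X, y, m=3)
```

In practice one would update blocks of weights in Gibbs fashion (as the paper does) rather than proposing all weights at once, which scales poorly as m grows.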

References

  1. Anderson, J. A. Logistic Discrimination. In Classification, Pattern Recognition and Reduction of Dimensionality (Handbook of Statistics), P. R. Krishnaiah and L. N. Kanal (eds.)
  2. Best, N. G.; Cowles, M. K.; Vines, S. K. Convergence Diagnosis and Output Analysis Software for Gibbs Sampling Output, Version 0.3
  3. Buntine, W.; Weigend, A. Bayesian back-propagation. Complex Systems v.5
  4. Chib, S.; Greenberg, E. Understanding the Metropolis-Hastings Algorithm. The American Statistician v.49. https://doi.org/10.2307/2684568
  5. Chung, Y.; Kim, H. Bayesian outlier detection in regression model. Journal of the Korean Statistical Society v.28
  6. Gelfand, A. E.; Smith, A. F. M. Sampling-Based Approaches to Calculating Marginal Densities. Journal of the American Statistical Association v.85. https://doi.org/10.2307/2289776
  7. George, E. I.; McCulloch, R. E. Variable Selection Via Gibbs Sampling. Journal of the American Statistical Association v.88. https://doi.org/10.2307/2290777
  8. Geweke, J. Evaluating the Accuracy of Sampling-Based Approaches to Calculating Posterior Moments. In Bayesian Statistics v.4, J. M. Bernardo, J. O. Berger, A. P. Dawid, and A. F. M. Smith (eds.)
  9. MacKay, D. J. C. Bayesian Methods for Adaptive Models. Ph.D. thesis, California Institute of Technology
  10. McCulloch, W. S.; Pitts, W. A Logical Calculus of the Ideas Immanent in Nervous Activity. Bulletin of Mathematical Biophysics v.5. https://doi.org/10.1007/BF02478259
  11. Metropolis, N.; Rosenbluth, A.; Rosenbluth, M.; Teller, A.; Teller, E. Equation of state calculations by fast computing machines. Journal of Chemical Physics v.21. https://doi.org/10.1063/1.1699114
  12. Muller, P.; Rios Insua, D. Feedforward Neural Networks for Nonparametric Regression. In Practical Nonparametric and Semiparametric Bayesian Statistics, D. Dey, P. Muller, and D. Sinha (eds.)
  13. Muller, P.; Rios Insua, D. Issues in Bayesian Analysis of Neural Network Models. Neural Computation v.10
  14. Neal, R. M. Bayesian Learning for Neural Networks
  15. Specht, D. F. A General Regression Neural Network. IEEE Trans. Neural Networks v.2. https://doi.org/10.1109/72.97934
  16. Tanner, M. A.; Wong, W. H. The Calculation of Posterior Distributions by Data Augmentation. Journal of the American Statistical Association v.82. https://doi.org/10.2307/2289457

Cited by

  1. Input Variable Importance in Supervised Learning Models vol.10, pp.1, 2003, https://doi.org/10.5351/CKSS.2003.10.1.239