
Parameter Estimation in Debris Flow Deposition Model Using Pseudo Sample Neural Network

  • Gyeong-Yong Heo (Department of Electronic Engineering, Dong-eui University) ;
  • Chang-Woo Lee (Division of Forest Disaster Management, Korea Forest Research Institute) ;
  • Choong-Shik Park (School of Smart IT, Youngdong University)
  • Received : 2012.08.17
  • Accepted : 2012.10.18
  • Published : 2012.11.30

Abstract


The debris flow deposition model, built on a random walk model (RWM), predicts the areas affected by debris flow. Although the model has proved effective in predicting affected areas, it contains several free parameters that must be determined experimentally. Several well-known methods exist for parameter estimation; however, they cannot be applied directly to the debris flow problem because of the small size of the training data. In this paper, a modified neural network, called the pseudo sample neural network (PSNN), is proposed to overcome the sample size problem. In the training phase, PSNN uses pseudo samples generated from the existing samples. The pseudo samples smooth the solution space and reduce the probability of falling into a local optimum, so PSNN can estimate parameters more robustly than traditional neural networks. These claims are demonstrated through experiments using artificial and real data sets.
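The abstract's core idea, augmenting a small training set with pseudo samples derived from the existing samples, can be sketched as follows. The abstract does not specify the generation mechanism, so this sketch assumes a simple Gaussian perturbation of existing input vectors; the function name and `noise_scale` parameter are illustrative only.

```python
import random

def pseudo_samples(samples, n_new, noise_scale=0.05, seed=0):
    """Generate pseudo samples by jittering existing (input, target) pairs.

    Gaussian perturbation is one plausible scheme; the paper's exact
    generation mechanism is not given here, so treat this as a sketch.
    """
    rng = random.Random(seed)
    out = []
    for _ in range(n_new):
        x, y = rng.choice(samples)                        # pick an existing pair
        jittered = [v + rng.gauss(0.0, noise_scale) for v in x]
        out.append((jittered, y))                         # keep the original target
    return out

# Usage: augment a tiny training set before fitting a neural network.
train = [([0.1, 0.2], 1.0), ([0.9, 0.8], 0.0)]
augmented = train + pseudo_samples(train, n_new=20)
```

Training on `augmented` rather than `train` is what, per the abstract, smooths the solution space and lowers the chance of converging to a local optimum.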

Keywords

References

  1. L. Grady, "Random Walks for Image Segmentation," IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 28, No. 11, pp. 1768-1783, Nov. 2006. https://doi.org/10.1109/TPAMI.2006.233
  2. Chang-Woo Lee, Choongshik Woo, and Ho-Joong Youn, "Analysis of Debris Flow Hazard Zone by the Optimal Parameters Extraction of Random Walk Model − Case on Debris Flow Area of Bonghwa County in Gyeongbuk Province," Journal of Korean Forest Society, Vol. 100, No. 4, pp. 664-671, Apr. 2011.
  3. R.P.W. Duin, "Small sample size generalization," Proceedings of the 9th Scandinavian Conference on Image Analysis, pp. 957-964, Oct. 1995.
  4. S. Haykin, "Neural Networks: A Comprehensive Foundation," 2nd ed. Prentice Hall, 1998.
  5. C.M. Bishop, "Pattern Recognition and Machine Learning," Springer, 2006.
  6. R. Polikar, L. Udpa, S.S. Udpa, and V. Honavar, "Learn++: An Incremental Learning Algorithm for Supervised Neural Networks," IEEE Transactions on Systems, Man, and Cybernetics - Part C: Applications and Reviews, Vol. 31, No. 4, pp. 497-508, Aug. 2001. https://doi.org/10.1109/5326.983933
  7. D. Foley, "Considerations of sample and feature size," IEEE Transactions on Information Theory, Vol. 18, No. 5, pp. 618-628, Oct. 1972. https://doi.org/10.1109/TIT.1972.1054863
  8. S. Uchimura, Y. Hamamoto, and S. Tomita, "Effects of the sample size in artificial neural network classifier design," Proceedings of the IEEE International Conference on Neural Networks, pp. 2126-2129, Dec. 1995.
  9. T.G. Niel, T.R. McVicar, and B. Datt, "On the relationship between training sample size and data dimensionality: Monte Carlo analysis of broadband multi-temporal classification," Remote Sensing of Environment, Vol. 98, No. 4, pp. 468-480, Oct. 2005. https://doi.org/10.1016/j.rse.2005.08.011
  10. R. Durrett, "Probability: Theory and Examples," 4th ed. Cambridge University Press, 2010.

Cited by

  1. Training Sample and Feature Selection Method in Pseudo Sample Neural Networks, vol.18, pp.4, 2013, https://doi.org/10.9708/jksci.2013.18.4.019