Online abnormal event detection with an online support vector machine

  • Received : 2011.01.11
  • Accepted : 2011.02.28
  • Published : 2011.03.31

Abstract

The ability to detect abnormal events in signals online is essential in many real-world signal processing applications. Previously known detection algorithms assume an explicit statistical model of the signal and interpret abnormal events as abrupt changes in that model, and maximum likelihood and Bayesian estimation theory have commonly been used for both estimation and detection. With these methods, however, it is not easy to estimate a model that is both robust and tractable, so an approach with more freedom in how the model is estimated is needed. In this paper, we investigate a machine learning, descriptor-based approach that does not require an explicit statistical model of the descriptors: the method is built on support vector machines, which are known to be robust, and a sequential optimization algorithm, the online support vector machine, is introduced to detect abnormal signals online.
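To make the idea of sequential, model-free detection concrete, the sketch below shows one possible way such a detector could be run online: a one-class SVM trained by stochastic gradient descent on a random-Fourier-feature approximation of an RBF kernel, so that each incoming signal window is scored and then used to update the model. This is a minimal illustration under stated assumptions, not the authors' algorithm; the class name OnlineOneClassSVM, the feature map, and all parameter values are hypothetical choices made for the example.

```python
# Minimal sketch (illustrative, not the paper's implementation) of online
# abnormal-event detection with a one-class SVM updated by stochastic gradient
# descent.  An RBF kernel is approximated with random Fourier features so the
# per-sample update stays cheap; all names and parameters are assumptions.
import numpy as np


class OnlineOneClassSVM:
    """One-class SVM (nu-formulation) updated sequentially with SGD."""

    def __init__(self, n_features, gamma=1.0, nu=0.1, lr=0.01,
                 n_components=100, seed=0):
        rng = np.random.default_rng(seed)
        # Random Fourier features approximating the kernel exp(-gamma * ||x - y||^2).
        self.W = rng.normal(scale=np.sqrt(2.0 * gamma),
                            size=(n_features, n_components))
        self.b = rng.uniform(0.0, 2.0 * np.pi, size=n_components)
        self.w = np.zeros(n_components)   # weight vector in feature space
        self.rho = 0.0                    # offset of the separating hyperplane
        self.nu = nu
        self.lr = lr
        self.n_components = n_components

    def _phi(self, x):
        return np.sqrt(2.0 / self.n_components) * np.cos(x @ self.W + self.b)

    def score(self, x):
        """Decision value f(x) = <w, phi(x)> - rho; negative values flag abnormal events."""
        return float(self._phi(x) @ self.w - self.rho)

    def partial_fit(self, x):
        """One SGD step on the per-sample one-class SVM objective
        0.5*||w||^2 - rho + (1/nu) * max(0, rho - <w, phi(x)>)."""
        phi = self._phi(x)
        margin = phi @ self.w - self.rho
        if margin < 0.0:                   # hinge term is active
            grad_w = self.w - phi / self.nu
            grad_rho = -1.0 + 1.0 / self.nu
        else:
            grad_w = self.w
            grad_rho = -1.0
        self.w -= self.lr * grad_w
        self.rho -= self.lr * grad_rho
        return self


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    detector = OnlineOneClassSVM(n_features=8, gamma=0.5, nu=0.1)

    # Stream of normal signal windows, with an abrupt change injected at t = 300.
    for t in range(400):
        window = rng.normal(0.0, 1.0, size=8)
        if t >= 300:
            window += 4.0                  # simulated abnormal event
        if t > 50 and detector.score(window) < 0.0:
            print(f"t={t}: abnormal window (score={detector.score(window):.3f})")
        detector.partial_fit(window)       # sequential (online) model update
```

Because the kernel is approximated by a fixed number of random features, each update costs only a constant number of operations per window, which is what allows the model to be refit sequentially as the signal arrives instead of being re-estimated in batch.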

