Using Support Vector Regression for Optimization of Black-box Objective Functions

- Journal title : Communications for Statistical Applications and Methods
- Volume 15, Issue 1, 2008, pp.125-136
- Publisher : The Korean Statistical Society
- DOI : 10.5351/CKSS.2008.15.1.125

Title & Authors

Using Support Vector Regression for Optimization of Black-box Objective Functions

Kwak, Min-Jung; Yoon, Min

Abstract

In many practical engineering design problems, the objective functions are not given explicitly in terms of the design variables. Instead, given values of the design variables, the values of the objective functions are obtained through real or computational experiments such as structural analysis, fluid mechanics analysis, thermodynamic analysis, and so on. These experiments are, in general, considerably expensive. To keep the number of experiments as small as possible, optimization is performed in parallel with predicting the form of the objective functions. Response surface methods (RSM) are well known along these lines. This paper suggests applying support vector machines (SVM) to predict the objective functions. One of the most important tasks in this approach is to allocate sample data judiciously so that the number of experiments remains as small as possible. We show that the information carried by the support vectors can be used effectively for this purpose. The effectiveness of the suggested method is demonstrated through a numerical example that is well known in engineering design.
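The general idea described in the abstract, fitting a cheap regression surrogate to the expensive black-box objective and evaluating the true objective only where the surrogate looks promising, can be sketched as follows. This is a minimal illustration using scikit-learn's `SVR`, not the paper's method: the stand-in objective, the kernel settings, and the simple "evaluate at the surrogate's minimizer" sampling rule are all assumptions (the paper instead allocates samples using support-vector information and a genetic algorithm).

```python
import numpy as np
from sklearn.svm import SVR

# Cheap stand-in for an expensive experiment (structural/fluid/thermal analysis).
def expensive_objective(x):
    return (6 * x - 2) ** 2 * np.sin(12 * x - 4)

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(8, 1))   # small initial design
y = expensive_objective(X).ravel()

for _ in range(10):                   # sequential approximation loop
    # Refit the SVR surrogate to all data gathered so far.
    model = SVR(kernel="rbf", C=100.0, epsilon=0.05).fit(X, y)
    # Minimize the cheap surrogate on a dense grid instead of the true function.
    grid = np.linspace(0, 1, 1001).reshape(-1, 1)
    x_new = grid[np.argmin(model.predict(grid))]
    # Spend one real experiment only at the surrogate's minimizer.
    X = np.vstack([X, x_new])
    y = np.append(y, expensive_objective(x_new))

x_best = X[np.argmin(y)]              # best design found with 18 evaluations
```

The point of the loop is that each iteration costs one true evaluation plus many cheap surrogate predictions, which is the trade the abstract describes.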

Keywords

Support vector regression; genetic algorithm; global and local information; optimization

Language

Korean

References

1.

Cristianini, N. and Shawe-Taylor, J. (2000). An Introduction to Support Vector Machines and Other Kernel-based Learning Methods. Cambridge University Press

2.

Eshelman, L. J. and Schaffer, J. D. (1993). Real-coded genetic algorithms and interval-schemata. In Foundations of Genetic Algorithms 2 (Whitely, L. D., ed.), Morgan Kaufmann

3.

Haykin, S. (1999). Neural Networks: A Comprehensive Foundation. 2nd ed., Prentice Hall

4.

Hsu, Y. H., Sun, T. L. and Leu, L. H. (2000). A two-stage sequential approximation method for nonlinear discrete variable optimization. ASME Design Engineering Technical Conference, Boston, 197-202

5.

Jones, D. R., Schonlau, M. and Welch, W. J. (1998). Efficient global optimization of expensive black-box functions. Journal of Global Optimization, 13, 455-492

6.

Kannan, B. K. and Kramer, S. N. (1994). An augmented Lagrange multiplier based method for mixed integer discrete continuous optimization and its applications to mechanical design. Journal of Mechanical Design, 116, 405-411

7.

Myers, R. H. and Montgomery, D. C. (1995). Response Surface Methodology: Process and Product Optimization using Designed Experiments. John Wiley & Sons, New York

8.

Nakayama, H., Arakawa, M. and Sasaki, K. (2002). Simulation based optimization for unknown objective functions. Optimization and Engineering, 3, 201-214

9.

Radcliffe, N. J. (1991). Forma analysis and random respectful recombination. In Proceedings of the Fourth International Conference on Genetic Algorithms, 222-229

10.

Sacks, J., Welch, W. J., Mitchell, T. J. and Wynn, H. P. (1989). Design and analysis of computer experiments. Statistical Science, 4, 409-435

11.

Schonlau, M. (1997). Computer Experiments and Global Optimization. PhD. thesis, University of Waterloo, Ontario, Canada

12.

Scholkopf, B. and Smola, A. J. (2002). Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond. The MIT Press