 Title & Authors
Using Support Vector Regression for Optimization of Black-box Objective Functions
Kwak, Min-Jung; Yoon, Min;
 Abstract
In many practical engineering design problems, the objective functions are not given explicitly in terms of the design variables. Instead, given values of the design variables, the corresponding objective values are obtained through real or computational experiments such as structural analysis, fluid mechanics analysis, or thermodynamic analysis. Since these experiments are generally very expensive, optimization is performed in parallel with predicting the form of the objective functions so that as few experiments as possible are required. Response surface methods (RSM) are well known along these lines. This paper proposes applying support vector machines (SVM) to predict the objective functions. One of the most important tasks in this approach is to allocate sample data judiciously so that the number of experiments is kept as small as possible. It is shown that the information provided by the support vectors can be used effectively for this purpose. The effectiveness of the proposed method is demonstrated on a numerical example well known in engineering design.
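The surrogate-assisted loop described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: it uses scikit-learn's SVR as the surrogate and a hypothetical one-dimensional test function in place of an expensive experiment.

```python
import numpy as np
from sklearn.svm import SVR

def expensive_objective(x):
    """Stand-in for a costly experiment (hypothetical test function)."""
    return (x - 0.3) ** 2

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(8, 1))   # initial design points
y = expensive_objective(X).ravel()       # expensive evaluations

for _ in range(10):
    # Fit an SVR surrogate to all data gathered so far.
    model = SVR(kernel="rbf", C=100.0, epsilon=0.01).fit(X, y)
    # model.support_ indexes the support vectors: points the surrogate
    # does not fit within the epsilon-tube, i.e. regions of poor fit.
    # The paper exploits this information when allocating new samples;
    # here we simply evaluate the predicted minimizer over a grid.
    cand = np.linspace(0.0, 1.0, 201).reshape(-1, 1)
    x_next = cand[np.argmin(model.predict(cand))].reshape(1, 1)
    X = np.vstack([X, x_next])
    y = np.append(y, expensive_objective(x_next).ravel())

best_x = float(X[np.argmin(y), 0])       # best design found so far
```

The grid search over the surrogate is a placeholder: the paper's keywords indicate that a genetic algorithm is used to optimize over the predicted objective, which scales better than a grid in higher dimensions.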
 Keywords
Support vector regression; genetic algorithm; global and local information; optimization
 Language
Korean
 References
1.
Cristianini, N. and Shawe-Taylor, J. (2000). An Introduction to Support Vector Machines and Other Kernel-based Learning Methods. Cambridge University Press

2.
Eshelman, L. J. and Schaffer, J. D. (1993). Real-coded genetic algorithms and interval-schemata. In Foundations of Genetic Algorithms 2 (Whitley, L. D., ed.), Morgan Kaufmann

3.
Haykin, S. (1999). Neural Networks: A Comprehensive Foundation. 2nd ed., Prentice Hall

4.
Hsu, Y. H., Sun, T. L. and Leu, L. H. (2000). A two-stage sequential approximation method for nonlinear discrete variable optimization. ASME Design Engineering Technical Conference, Boston, 197-202

5.
Jones, D. R., Schonlau, M. and Welch, W. J. (1998). Efficient global optimization of expensive black-box functions. Journal of Global Optimization, 13, 455-492

6.
Kannan, B. K. and Kramer, S. N. (1994). An augmented Lagrange multiplier based method for mixed integer discrete continuous optimization and its applications to mechanical design. Journal of Mechanical Design, 116, 405-411

7.
Myers, R. H. and Montgomery, D. C. (1995). Response Surface Methodology: Process and Product Optimization using Designed Experiments. John Wiley & Sons, New York

8.
Nakayama, H., Arakawa, M. and Sasaki, K. (2002). Simulation based optimization for unknown objective functions. Optimization and Engineering, 3, 201-214

9.
Radcliffe, N. J. (1991). Forma analysis and random respectful recombination. In Proceedings of the Fourth International Conference on Genetic Algorithms, 222-229

10.
Sacks, J., Welch, W. J., Mitchell, T. J. and Wynn, H. P. (1989). Design and analysis of computer experiments. Statistical Science, 4, 409-435

11.
Schonlau, M. (1997). Computer Experiments and Global Optimization. Ph.D. thesis, University of Waterloo, Ontario, Canada

12.
Scholkopf, B. and Smola, A. J. (2002). Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond. The MIT Press

13.
Zabinsky, Z. B. (1998). Stochastic methods for practical global optimization. Journal of Global Optimization, 13, 433-444

14.
Zhang, C. and Wang, H. P. (1993). Mixed-discrete nonlinear optimization with simulated annealing. Engineering Optimization, 21, 277-291