• Title/Summary/Keyword: Penalty function

292 search results

A Penalized Principal Component Analysis using Simulated Annealing

  • Park, Chongsun; Moon, Jong Hoon
    • Communications for Statistical Applications and Methods, v.10 no.3, pp.1025-1036, 2003
  • A variable selection algorithm for principal component analysis using penalty functions is proposed. We use the fact that the usual principal component problem can be expressed as a maximization problem under appropriate constraints, and we add a penalty function to this maximization problem. A simulated annealing algorithm is used to search for optimal solutions of the penalized problem. A simulation comparison of several well-known penalty functions suggests the HARD penalty function as the best in several respects. Illustrations with real and simulated examples are provided.
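
A rough sketch of the approach in Python (not the authors' implementation: the HARD penalty on the loadings, the cooling schedule, and the step size are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def hard_penalty(w, lam):
    # HARD thresholding penalty: lam^2 - (|t| - lam)^2 for |t| < lam,
    # constant lam^2 beyond, so it is bounded and singular at the origin
    t = np.abs(w)
    return np.where(t < lam, lam**2 - (t - lam)**2, lam**2).sum()

def objective(w, S, lam):
    # penalized PCA criterion: explained variance minus sparsity penalty
    return w @ S @ w - hard_penalty(w, lam)

def simulated_annealing(S, lam, n_iter=5000, T0=1.0):
    p = S.shape[0]
    w = rng.standard_normal(p)
    w /= np.linalg.norm(w)                     # respect the ||w|| = 1 constraint
    best, best_val = w, objective(w, S, lam)
    for k in range(n_iter):
        T = T0 / (1 + k)                       # simple cooling schedule
        cand = w + 0.1 * rng.standard_normal(p)
        cand /= np.linalg.norm(cand)           # stay on the unit sphere
        delta = objective(cand, S, lam) - objective(w, S, lam)
        if delta > 0 or rng.random() < np.exp(delta / T):
            w = cand                           # accept uphill, sometimes downhill
        if objective(w, S, lam) > best_val:
            best, best_val = w, objective(w, S, lam)
    return best
```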

SMOOTHING APPROXIMATION TO l1 EXACT PENALTY FUNCTION FOR CONSTRAINED OPTIMIZATION PROBLEMS

  • Binh, Nguyen Thanh
    • Journal of Applied Mathematics & Informatics, v.33 no.3_4, pp.387-399, 2015
  • In this paper, a new smoothing approximation to the l1 exact penalty function for constrained optimization problems (COP) is presented. It is shown that an optimal solution of the smoothed penalty problem is an approximate optimal solution of the original problem. Based on the smoothing penalty function, an algorithm for solving COP is presented and its convergence is proved under suitable conditions. Numerical examples illustrate that the algorithm is efficient.
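
To make the construction concrete, here is a minimal sketch (the particular smoothing of max(0, t) and the toy problem are illustrative assumptions, not the paper's exact function): replacing the kink in the l1 penalty with a differentiable surrogate lets a standard unconstrained solver handle the problem.

```python
import numpy as np
from scipy.optimize import minimize

def smooth_plus(t, eps=1e-3):
    # differentiable surrogate for max(0, t); exact as eps -> 0
    return 0.5 * (t + np.sqrt(t**2 + eps))

def penalized(x, rho=100.0, eps=1e-3):
    # toy COP: minimize (x1-2)^2 + (x2-1)^2 subject to x1^2 + x2^2 <= 1
    f = (x[0] - 2)**2 + (x[1] - 1)**2
    g = x[0]**2 + x[1]**2 - 1.0
    return f + rho * smooth_plus(g, eps)       # smoothed l1 exact penalty

res = minimize(penalized, x0=np.zeros(2), method="BFGS")
print(res.x)  # close to (2, 1)/sqrt(5), the true constrained optimum
```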

AN EXACT LOGARITHMIC-EXPONENTIAL MULTIPLIER PENALTY FUNCTION

  • Lian, Shu-jun
    • Journal of Applied Mathematics & Informatics, v.28 no.5_6, pp.1477-1487, 2010
  • In this paper, we give a solution approach for the constrained minimization problem based on a logarithmic-exponential multiplier penalty function. The penalty function is proved exact in the sense that the local optimizers of the nonlinear problem are precisely the local optimizers of the logarithmic-exponential multiplier penalty problem.
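
The paper's exact penalty form is not reproduced here; purely as an illustration, a commonly used logarithmic-exponential construction weights each constraint with a multiplier and relaxes the violation max(0, g_i(x)) through the softplus function, which tightens as the parameter p grows:

```python
import numpy as np

def log_exp_penalty(f, gs, x, u, p):
    # Illustrative logarithmic-exponential multiplier penalty (the form is
    # an assumption, not necessarily Lian's): each multiplier u_i weights a
    # softplus smoothing (1/p) * log(1 + exp(p * g_i(x))) of the violation
    # max(0, g_i(x)) for constraints g_i(x) <= 0.
    gx = np.array([g(x) for g in gs])
    return f(x) + (u / p) @ np.log1p(np.exp(p * gx))
```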

Weighted Support Vector Machines with the SCAD Penalty

  • Jung, Kang-Mo
    • Communications for Statistical Applications and Methods, v.20 no.6, pp.481-490, 2013
  • Classification is an important research area now that data with huge numbers of predictors are easily obtained. The support vector machine (SVM) is widely used to classify a subject into one of a set of predetermined groups because it has a sound theoretical foundation and performs better than other methods in many applications. The SVM can be viewed as a penalized method combining the hinge loss with a penalty function. As an alternative to the $L_2$ penalty, Fan and Li (2001) proposed the smoothly clipped absolute deviation (SCAD) penalty, which enjoys good statistical properties. Despite their strengths, SVMs are not robust when the data contain outliers. We develop a robust SVM that combines a weight function with the SCAD penalty, fitted via local quadratic approximation, and compare its performance with that of SVMs using the $L_1$ and $L_2$ penalties.
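
For reference, the SCAD penalty of Fan and Li (2001) and the ridge-type weight produced by the local quadratic approximation mentioned in the abstract can be sketched as follows (a = 3.7 is the value Fan and Li suggest; the code is illustrative, not the authors'):

```python
import numpy as np

def scad(theta, lam, a=3.7):
    # SCAD penalty of Fan and Li (2001), applied elementwise
    t = np.abs(theta)
    p1 = lam * t                                       # |t| <= lam: acts like L1
    p2 = -(t**2 - 2*a*lam*t + lam**2) / (2*(a - 1))    # lam < |t| <= a*lam
    p3 = (a + 1) * lam**2 / 2                          # |t| > a*lam: flat, no bias
    return np.where(t <= lam, p1, np.where(t <= a*lam, p2, p3))

def lqa_weight(theta0, lam, a=3.7, eps=1e-8):
    # local quadratic approximation around theta0:
    # p(|t|) ~ p(|t0|) + 0.5 * (p'(|t0|)/|t0|) * (t^2 - t0^2),
    # so each coefficient contributes a ridge-type weight p'(|t0|)/|t0|
    t = np.abs(theta0)
    deriv = lam * np.where(t <= lam, 1.0,
                           np.maximum(a*lam - t, 0.0) / ((a - 1) * lam))
    return deriv / (t + eps)
```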

An Optimization Technique For Crane Acceleration Using A Genetic Algorithm (유전자알고리즘을 이용한 크레인가속도 최적화)

  • 박창권; 김재량; 정원지; 홍대선; 권장렬; 박범석
    • Proceedings of the Korean Society of Precision Engineering Conference, 2003.06a, pp.1701-1704, 2003
  • This paper presents a new technique for optimizing the acceleration curve of a wafer-transfer crane, for which both high speed and low vibration are desirable. The technique is based on a genetic algorithm with a penalty function, with an initial acceleration profile forming the first generation. The penalty function combines the magnitude of constraint violations with the number of violated constraints, which makes the genetic algorithm converge faster than it does without a penalty. The acceleration optimized with the genetic algorithm and commercial dynamic analysis software is shown to yield accurate movement and low vibration.
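
A minimal sketch of such a penalized fitness function (the weights c1 and c2 on the two penalty terms are illustrative assumptions; the abstract does not specify them):

```python
import numpy as np

def penalized_fitness(objective, constraints, x, c1=1.0, c2=1.0):
    # Fitness of a GA individual x for minimization: the penalty combines
    # the total violation magnitude with the number of violated constraints,
    # as the abstract describes; constraints are feasible when g(x) <= 0.
    violations = np.array([max(0.0, g(x)) for g in constraints])
    return objective(x) + c1 * violations.sum() + c2 * np.count_nonzero(violations)
```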

Optimal Scheduling Algorithm for Minimizing the Quadratic Penalty Function of Completion Times (작업 완료시간의 2차벌과금함수를 최소화하는 알고리즘에 관한 연구)

  • 노인규; 이정환
    • Journal of Korean Society of Industrial and Systems Engineering, v.13 no.22, pp.35-42, 1990
  • This paper deals with a single-machine scheduling problem with a quadratic penalty function of completion times. The objective is to find an optimal sequence that minimizes the total penalty. A new node elimination procedure and a precedence relation that determines the ordering of adjacent jobs are developed and incorporated into a branch-and-bound algorithm. In addition, a modified penalty function is considered, and numerical examples test the effectiveness of the algorithm.
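
To fix the objective, here is a sketch of the total quadratic penalty for a given sequence, with an exhaustive search standing in for the paper's branch-and-bound (the per-job weights are an assumption; setting them all to 1 gives a plain sum of squared completion times):

```python
from itertools import accumulate, permutations

def quadratic_penalty(sequence, proc, w):
    # total penalty sum_j w[j] * C_j^2, where C_j is job j's completion time
    completions = accumulate(proc[j] for j in sequence)
    return sum(w[j] * c**2 for j, c in zip(sequence, completions))

def best_sequence(proc, w):
    # exhaustive search standing in for the paper's branch-and-bound
    # (only usable for small n, but it defines the same optimum)
    jobs = range(len(proc))
    return min(permutations(jobs), key=lambda s: quadratic_penalty(s, proc, w))
```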

Variable Selection in Sliced Inverse Regression Using Generalized Eigenvalue Problem with Penalties

  • Park, Chong-Sun
    • Communications for Statistical Applications and Methods, v.14 no.1, pp.215-227, 2007
  • A variable selection algorithm for sliced inverse regression (SIR) using penalty functions is proposed. We note that SIR models can be expressed as generalized eigenvalue decompositions and incorporate penalty functions into them. A small simulation study suggests that the HARD penalty function is the best among several well-known penalty functions at preserving the original directions, and that it is effective at forcing the coefficient estimates of irrelevant predictors to zero. Results from illustrative examples with simulated and real data sets are provided.
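
For context, the unpenalized SIR directions come from the generalized eigenvalue problem Cov(E[X|y]) v = lambda Cov(X) v; a minimal sketch of that formulation (without the paper's penalty terms, and assuming n > p so Cov(X) is positive definite) is:

```python
import numpy as np
from scipy.linalg import eigh

def sir_directions(X, y, n_slices=10, n_dir=2):
    # SIR as a generalized eigenvalue problem: Cov(E[X|y]) v = lambda Cov(X) v
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    Sigma = np.cov(Xc, rowvar=False)
    M = np.zeros((p, p))
    for s in np.array_split(np.argsort(y), n_slices):  # slice on sorted y
        m = Xc[s].mean(axis=0)                         # slice mean of X
        M += (len(s) / n) * np.outer(m, m)             # between-slice covariance
    vals, vecs = eigh(M, Sigma)                        # generalized eigenproblem
    return vecs[:, ::-1][:, :n_dir]                    # leading SIR directions
```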

Variable Selection with Nonconcave Penalty Function on Reduced-Rank Regression

  • Jung, Sang Yong; Park, Chongsun
    • Communications for Statistical Applications and Methods, v.22 no.1, pp.41-54, 2015
  • In this article, we propose nonconcave penalties on a reduced-rank regression model to select variables and estimate coefficients simultaneously. We apply the HARD (hard thresholding) and SCAD (smoothly clipped absolute deviation) penalties, symmetric functions that are singular at the origin and bounded by a constant, which reduces bias. In a simulation study and a real data analysis, the new method is compared with an existing $L_1$-penalized variable selection method and exhibits competitive performance in prediction and variable selection. Rather than relying on a single penalty, we use two or three penalty functions simultaneously, combining their strengths to select relevant predictors and improve the overall fit of the model.
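
A sketch of the HARD penalty, whose boundedness gives the bias reduction the abstract mentions, applied to the reduced-rank factorization (the row-wise placement of the penalty and the notation are assumptions for illustration, not the paper's exact criterion):

```python
import numpy as np

def hard(t, lam):
    # HARD penalty: quadratic near zero, flat at lam^2 beyond lam,
    # so large coefficients are left unshrunk (the bias reduction noted above)
    a = np.abs(t)
    return np.where(a < lam, lam**2 - (a - lam)**2, lam**2)

def penalized_rrr(Y, X, A, B, lam):
    # reduced-rank regression Y ~ X @ B @ A.T with rank-r factors
    # B (p x r) and A (q x r); penalizing row norms of B drops predictors
    resid = Y - X @ B @ A.T
    row_norms = np.linalg.norm(B, axis=1)
    return np.sum(resid**2) + np.sum(hard(row_norms, lam))
```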

Edge-Preserving Iterative Reconstruction in Transmission Tomography Using Space-Variant Smoothing (투과 단층촬영에서 공간가변 평활화를 사용한 경계보존 반복연산 재구성)

  • Jung, Ji Eun; Ren, Xue; Lee, Soo-Jin
    • Journal of Biomedical Engineering Research, v.38 no.5, pp.219-226, 2017
  • Penalized-likelihood (PL) reconstruction methods for transmission tomography are known to improve image quality at reduced dose levels by efficiently smoothing out noise while preserving edges. Unfortunately, most edge-preserving penalty functions used in conventional PL methods contain at least one free parameter that controls the shape of the non-quadratic penalty and thereby the sensitivity of edge preservation. To avoid the difficulty of choosing this free parameter, we propose a new adaptive method of space-variant smoothing with a simple quadratic penalty: at each iteration, the smoothing parameter is selected for each pixel using the image roughness, measured by a pixel-wise standard deviation image computed from the previous iteration. Experimental results demonstrate that the new method not only preserves edges but also suppresses noise well in monotonic regions, without requiring any additional parameter-selection step.
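
A minimal sketch of the per-pixel selection idea (the window size and the mapping from local standard deviation to smoothing weight are illustrative choices, not the paper's exact rule):

```python
import numpy as np
from scipy.ndimage import generic_filter

def space_variant_beta(prev_image, beta0=1.0, window=5):
    # per-pixel smoothing parameter from the roughness of the previous
    # iterate: rough (edge) pixels get a small quadratic-penalty weight,
    # flat pixels a large one, so edges survive while noise is smoothed
    local_std = generic_filter(prev_image, np.std, size=window)
    return beta0 / (1.0 + local_std / (local_std.mean() + 1e-12))
```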

Variable Selection in PLS Regression with Penalty Function (벌점함수를 이용한 부분최소제곱 회귀모형에서의 변수선택)

  • Park, Chong-Sun; Moon, Guy-Jong
    • Communications for Statistical Applications and Methods, v.15 no.4, pp.633-642, 2008
  • A variable selection algorithm for partial least squares (PLS) regression using penalty functions is proposed. We use the fact that the usual PLS regression problem can be expressed as a maximization problem under appropriate constraints and add a penalty function to this maximization problem. A simulated annealing algorithm is then used to search for optimal solutions of the penalized maximization problem. The HARD penalty function again emerges as the best in several respects. Illustrations with real and simulated examples are provided.
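
The penalized criterion for the first PLS direction might be sketched as follows, reusing the HARD penalty and the simulated annealing search shown under the penalized-PCA entry above (the centering and the placement of the penalty are illustrative assumptions):

```python
import numpy as np

def penalized_pls_criterion(w, X, y, lam):
    # first PLS direction maximizes cov(Xw, y)^2 subject to ||w|| = 1;
    # subtracting the HARD penalty drives small loadings toward zero
    Xc = X - X.mean(axis=0)
    cov = w @ Xc.T @ (y - y.mean())
    t = np.abs(w)
    hard = np.where(t < lam, lam**2 - (t - lam)**2, lam**2).sum()
    return cov**2 - hard
```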