• Title/Summary/Keyword: selection of principal components


Principal Component Regression by Principal Component Selection

  • Lee, Hosung; Park, Yun Mi; Lee, Seokho
    • Communications for Statistical Applications and Methods, v.22 no.2, pp.173-180, 2015
  • We propose a selection procedure for principal components in principal component regression. Rather than simply retaining a small subset of the major principal components, our method selects principal components using variable selection procedures. The procedure consists of two steps designed to improve estimation and prediction: first, we reduce the number of principal components using conventional principal component regression to obtain a candidate set, and then we select principal components from that candidate set using sparse regression techniques. The performance of our proposals is demonstrated numerically and compared with typical dimension reduction approaches (including principal component regression and partial least squares regression) on synthetic and real datasets.
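
The two-step idea in this abstract (candidate reduction, then sparse selection) can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' code: the dataset, candidate-set size, and lasso penalty level are all invented for the example.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
y = 2.0 * X[:, 0] + rng.normal(scale=0.1, size=100)  # toy response

# Step 1: conventional PCR-style reduction to a candidate set of components
pca = PCA(n_components=5).fit(X)
Z = pca.transform(X)

# Step 2: sparse regression (here the lasso) selects components
# within the candidate set
lasso = Lasso(alpha=0.05).fit(Z, y)
selected = np.flatnonzero(lasso.coef_)  # indices of retained components
```

The point of the second step is that the lasso may drop a high-variance component that is irrelevant to the response, which plain PCR cannot do.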

Procedure for the Selection of Principal Components in Principal Components Regression (주성분회귀분석에서 주성분선정을 위한 새로운 방법)

  • Kim, Bu-Yong; Shin, Myung-Hee
    • The Korean Journal of Applied Statistics, v.23 no.5, pp.967-975, 2010
  • Since least squares estimation is not appropriate when multicollinearity exists among the regressors of a linear regression model, principal components regression is used to deal with the multicollinearity problem. This article suggests a new procedure for selecting suitable principal components, based on the condition index rather than the eigenvalue. Principal components whose condition indices exceed the upper cutoff limit are removed from the model, while those whose condition indices fall below the lower limit are included. For principal components whose condition indices lie between the two limits, a forward inclusion method is employed to select the proper components. The limits are obtained from a linear model constructed on the basis of conjoint analysis. The procedure is evaluated by Monte Carlo simulation in terms of the mean square error of the estimator, and the simulation results indicate that it is superior to the existing methods.
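
A condition-index screening rule of the kind described above can be sketched as follows. The cutoff values here are hypothetical placeholders (the paper derives its limits from a linear model based on conjoint analysis), and the forward inclusion step for the undecided components is only indicated, not implemented.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 6))
X[:, 5] = X[:, 0] + 0.01 * rng.normal(size=50)  # induce multicollinearity

# Standardize, then compute condition indices sqrt(lambda_max / lambda_j)
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
eigvals = np.linalg.eigvalsh(Xs.T @ Xs)[::-1]  # eigenvalues, descending
cond_index = np.sqrt(eigvals[0] / eigvals)

UPPER, LOWER = 30.0, 10.0            # hypothetical cutoff limits
removed = cond_index > UPPER         # components dropped outright
included = cond_index < LOWER        # components kept outright
undecided = ~removed & ~included     # would enter forward inclusion
```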

A Criterion for the Selection of Principal Components in the Robust Principal Component Regression (로버스트주성분회귀에서 최적의 주성분선정을 위한 기준)

  • Kim, Bu-Yong
    • Communications for Statistical Applications and Methods, v.18 no.6, pp.761-770, 2011
  • Robust principal components regression is suggested to deal with both the multicollinearity and outlier problems. A main aspect of robust principal components regression is the selection of an optimal set of principal components. Instead of the eigenvalues of the sample covariance matrix, a selection criterion is developed based on the condition index of the minimum volume ellipsoid estimator, which is highly robust against leverage points. In addition, least trimmed squares estimation is employed to cope with regression outliers. Monte Carlo simulation results indicate that the proposed criterion is superior to existing ones.
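
The general shape of such a robust criterion can be illustrated with scikit-learn's `MinCovDet`, which implements the MCD estimator; note this is a stand-in for the paper's minimum volume ellipsoid estimator, the cutoff is hypothetical, and the least trimmed squares regression step is omitted.

```python
import numpy as np
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(4)
X = rng.normal(size=(80, 5))
X[:5] += 8.0  # a few leverage points

# Robust covariance estimate (MCD here, as a stand-in for MVE),
# largely unaffected by the leverage points
mcd = MinCovDet(random_state=0).fit(X)
eigvals = np.linalg.eigvalsh(mcd.covariance_)[::-1]
cond_index = np.sqrt(eigvals[0] / eigvals)

keep = cond_index < 10.0  # hypothetical cutoff for retaining components
```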

Logistic Regression Classification by Principal Component Selection

  • Kim, Kiho; Lee, Seokho
    • Communications for Statistical Applications and Methods, v.21 no.1, pp.61-68, 2014
  • We propose binary classification methods that modify logistic regression classification. Instead of the original variables, we use principal components as predictors, selecting them via variable selection procedures. We describe the resulting classifiers and discuss their properties. The performance of our proposals is illustrated numerically and compared with other existing classification methods on synthetic and real datasets.
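
A rough analogue of selecting principal components for logistic regression via a variable selection procedure could look like the following; the use of an L1 penalty as the selection procedure is our assumption for illustration, and the paper's exact procedure may differ.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 8))
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # toy binary labels

# Principal components as predictors instead of the original variables
Z = PCA(n_components=6).fit_transform(X)

# An L1 penalty acts as the variable selection procedure over components
clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(Z, y)
used = np.flatnonzero(clf.coef_[0])  # components retained by the penalty
```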

A Penalized Principal Components using Probabilistic PCA

  • Park, Chong-Sun; Wang, Morgan
    • Proceedings of the Korean Statistical Society Conference, 2003.05a, pp.151-156, 2003
  • A variable selection algorithm for principal component analysis using a penalized likelihood method is proposed. We adopt the probabilistic principal component idea to obtain a likelihood function for the problem, and use the HARD penalty function to force the coefficients of irrelevant variables in each component to zero. Consistency and sparsity of the coefficient estimates are demonstrated with results from small simulated examples and an illustrative real example.

Algorithm for Finding the Best Principal Component Regression Models for Quantitative Analysis using NIR Spectra (근적외 스펙트럼을 이용한 정량분석용 최적 주성분회귀모델을 얻기 위한 알고리듬)

  • Cho, Jung-Hwan
    • Journal of Pharmaceutical Investigation, v.37 no.6, pp.377-395, 2007
  • Near infrared (NIR) spectral data have been used for the noninvasive analysis of various biological samples. Nonetheless, absorption bands in the NIR region overlap extensively, so it is very difficult to select the wavelengths that yield the best PCR (principal component regression) models for analyzing the constituents of biological samples. The NIR data were used after polynomial smoothing and first-order differentiation with Savitzky-Golay filters. To find the best PCR models, all possible combinations of the available principal components from the given NIR spectral data were generated by in-house programs written in MATLAB. All of the extensively generated PCR models were compared in terms of SEC (standard error of calibration), $R^2$, SEP (standard error of prediction), and SECP (standard error of calibration and prediction) to find the best combination of principal components. The initial PCR models were found by SEC or Malinowski's indicator function, and a priori selection of spectral points was examined in terms of the correlation coefficients between the NIR data at each wavelength and the corresponding concentrations. To test the developed program, aqueous solutions of BSA (bovine serum albumin) and glucose were prepared and analyzed. As a result, the best PCR models were found using a priori selection of spectral points and final model selection by SEP or SECP.
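
The exhaustive search over principal-component combinations described above (the original programs were written in MATLAB) can be mimicked in Python. The data here are synthetic stand-ins for NIR spectra, and only the SEP criterion is shown; the sizes and number of components are invented for the example.

```python
import numpy as np
from itertools import combinations
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
X_cal = rng.normal(size=(40, 12))                 # calibration "spectra"
y_cal = X_cal[:, 0] + 0.1 * rng.normal(size=40)   # toy concentrations
X_val = rng.normal(size=(20, 12))                 # prediction set
y_val = X_val[:, 0] + 0.1 * rng.normal(size=20)

pca = PCA(n_components=5).fit(X_cal)
Z_cal, Z_val = pca.transform(X_cal), pca.transform(X_val)

# All possible combinations of the available principal components
best = None
for r in range(1, 6):
    for combo in combinations(range(5), r):
        cols = list(combo)
        model = LinearRegression().fit(Z_cal[:, cols], y_cal)
        resid = y_val - model.predict(Z_val[:, cols])
        sep = np.sqrt(np.mean(resid ** 2))  # standard error of prediction
        if best is None or sep < best[0]:
            best = (sep, combo)
```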

Application of varimax rotated principal component analysis in quantifying some zoometrical traits of a relict cow

  • Pares-Casanova, P.M.; Sinfreu, I.; Villalba, D.
    • Korean Journal of Veterinary Research, v.53 no.1, pp.7-10, 2013
  • A study was conducted to determine the interdependence among the conformation traits of 28 "Pallaresa" cows using principal component analysis. Twenty-one body linear measurements were originally obtained, from which eight traits were subsequently eliminated. From the principal component analysis, with raw varimax rotation of the transformation matrix, two principal components were extracted, accounting for 65.8% of the total variance. The first principal component alone explained 51.6% of the variation and tended to describe general size, while the second principal component had its highest loading on back-sternal diameter. The two extracted principal components, which relate to dorsal heights and back-sternal diameter, could be considered in selection programs.
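
A raw varimax rotation of PCA loadings can be coded directly. The following is a generic textbook implementation applied to synthetic data (28 cases by 13 traits, mirroring the study's dimensions), not the authors' software.

```python
import numpy as np
from sklearn.decomposition import PCA

def varimax(loadings, max_iter=100, tol=1e-6):
    """Raw varimax rotation of a p x k loading matrix (standard algorithm)."""
    p, k = loadings.shape
    R = np.eye(k)
    d = 0.0
    for _ in range(max_iter):
        L = loadings @ R
        # SVD step that maximizes the varimax criterion
        u, s, vt = np.linalg.svd(
            loadings.T @ (L ** 3 - L * (L ** 2).sum(axis=0) / p)
        )
        R = u @ vt
        d_old, d = d, s.sum()
        if d_old != 0 and d < d_old * (1 + tol):
            break
    return loadings @ R

rng = np.random.default_rng(5)
X = rng.normal(size=(28, 13))  # 28 cases x 13 traits, as in the study
pca = PCA(n_components=2).fit(X)
loadings = pca.components_.T * np.sqrt(pca.explained_variance_)
rotated = varimax(loadings)
```

Because the rotation is orthogonal, the total squared loading (communality) is preserved; only its distribution across components changes, which is what makes the rotated components easier to interpret.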

Observation on the shape of the neck -by principal component analysis of the measurements- (피복 구성을 위한 경부 형태의 관찰)

  • 이연순
    • Journal of the Ergonomics Society of Korea, v.10 no.2, pp.31-42, 1991
  • To understand the shape of the neck from the viewpoint of garment planning, principal component analysis was applied to measurements of the neck, and the neck surface developments and cross sections of the neck were observed. The materials consist of the body measurements, neck surface developments, and neck cross sections of 108 Korean female students. 1. No difference between the right and left sides of the neck was recognized, but differences among the heights of the front neck point, side neck point, and back neck point were recognized. 2. The initial 41 items showed variety and duplication, so two criteria were established to address these problems and 34 items were selected by each criterion. 3. The 41-item and 34-item sets were compared by means of accumulated contribution ratios and clearness of meaning of the principal components; as a result, the 34 measurement items were used for further analysis. 4. Principal component analysis of the 34 items yielded four interpretable principal components: 1) the thickness of the neck, 2) the front neck-line on the waist basic pattern, 3) the shape of the neck surface development, and 4) the back neck-line on the waist basic pattern. 5. Graphic information concerning these principal components allowed their meanings to be grasped visually; as a result, a large individual difference was found in the shape of the neck.

Gaussian Density Selection Method of CDHMM in Speaker Recognition (화자인식에서 연속밀도 은닉마코프모델의 혼합밀도 결정방법)

  • 서창우; 이주헌; 임재열; 이기용
    • The Journal of the Acoustical Society of Korea, v.22 no.8, pp.711-716, 2003
  • This paper proposes a method to select the optimal number of mixtures in each state of a continuous density HMM (hidden Markov model). Previously, researchers used the same number of mixture components in every state of the HMM regardless of the spectral characteristics of the speaker. To model each speaker as accurately as possible, we propose using a different number of mixture components for each state. The selection of mixture components considers the probability value of each mixture in each state, which strongly affects parameter estimation of the continuous density HMM. We also use PCA (principal component analysis) to reduce correlation and preserve system stability when the number of mixture components is reduced. In experiments where the proposed method used on average 10% fewer mixture components than the conventional HMM, applying the selection of mixture components alone achieved similar performance; when principal component analysis was also used, 16th-order feature vectors showed an average performance decrease of 0.35%, while 25th-order feature vectors showed an average improvement of 0.65%.

Regional Geological Mapping by Principal Component Analysis of the Landsat TM Data in a Heavily Vegetated Area (식생이 무성한 지역에서의 Principal Component Analysis 에 의한 Landsat TM 자료의 광역지질도 작성)

  • 朴鍾南; 徐延熙
    • Korean Journal of Remote Sensing, v.4 no.1, pp.49-60, 1988
  • Principal component analysis (PCA) was applied for regional geological mapping to a multivariate Landsat TM data set over the heavily vegetated and topographically rugged Chungju area. The variables were selected by statistical analysis based on the magnitude of the regression sum of squares in multiple regression, and the set includes R1/2/R3/4, R2/3, R5/7/R4/3, R1/2, R3/4, R4/3, and R4/5. As a result of the PCA, some of the later principal components (in this study, PC 3 and PC 5) proved geologically more significant than the earlier major components PC 1 and PC 2. The first two components, which comprise 96% of the total information in the data set, mainly represent vegetation reflectance and topographic effects. Although the remaining components carry only 3% of the total information, which would ordinarily be regarded as statistically unstable, the geological significance of PC 3 and PC 5 in this study implies that applying the technique in more favorable areas should yield much better results.