REFERENCE LINKING PLATFORM OF KOREA S&T JOURNALS
Communications for Statistical Applications and Methods
Journal Basic Information
Publisher: The Korean Statistical Society
Volume & Issues
Volume 16, Issue 6 - Nov 2009
Volume 16, Issue 5 - Sep 2009
Volume 16, Issue 4 - Jul 2009
Volume 16, Issue 3 - May 2009
Volume 16, Issue 2 - Mar 2009
Volume 16, Issue 1 - Jan 2009
Probabilistic Approach to Government Employee Pension System
Kim, Joo-Yoo ; Song, Seong-Joo ;
Communications for Statistical Applications and Methods, volume 16, issue 4, 2009, Pages 557~572
DOI : 10.5351/CKSS.2009.16.4.557
This article examines the financial soundness of the government employee pension system (GEPS). We use a model that simplifies the existing GEPS, incorporating the survival probability distribution of employees' lifetimes. Two approaches were taken: one is the expected net value of the pension for an individual employee, and the other is the default probability of the system estimated by Monte Carlo simulation. The outcome reveals the following three findings. First, the individual expected net value shows an imbalance between a retiree's premiums and the benefits he/she receives. Second, the Monte Carlo simulation suggests that default is highly likely to occur within 30 years. Third, a certain level of governmental reserve and subsidy for the GEPS is required in order to keep the probability of default below 5 percent for the next 30 years.
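The default-probability approach described above can be sketched as a Monte Carlo simulation over the reserve fund. All parameters below (initial fund, premium, benefit, return distribution) are hypothetical placeholders, not the paper's calibrated GEPS model:

```python
import random

def default_probability(n_sims=10_000, horizon=30, fund0=100.0,
                        premium=8.0, benefit=12.0,
                        mean_return=0.04, sd_return=0.08, seed=1):
    """Estimate the probability that the pension reserve is exhausted
    within `horizon` years by Monte Carlo simulation."""
    rng = random.Random(seed)
    defaults = 0
    for _ in range(n_sims):
        fund = fund0
        for _year in range(horizon):
            r = rng.gauss(mean_return, sd_return)   # random annual return
            fund = fund * (1.0 + r) + premium - benefit
            if fund < 0:                            # reserve exhausted -> default
                defaults += 1
                break
    return defaults / n_sims

print(default_probability())
```

With inflows roughly matching outflows at the assumed mean return, the estimated probability is sensitive to the return volatility, which is the qualitative point the abstract makes about the need for reserves and subsidies.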
Note on the Consistency of a Penalized Maximum Likelihood Estimate
Ahn, Sung-Mahn ;
Communications for Statistical Applications and Methods, volume 16, issue 4, 2009, Pages 573~578
DOI : 10.5351/CKSS.2009.16.4.573
We prove the consistency of the penalized maximum likelihood estimate (PMLE) proposed by Ahn (2001). The PMLE not only avoids the well-known problem that the ordinary likelihood of the normal mixture model is unbounded for any given sample size, but also removes redundant components.
A Study on the Comparison between E-MDR and D-MDR in Continuous Data
Lee, Jea-Young ; Lee, Ho-Guen ;
Communications for Statistical Applications and Methods, volume 16, issue 4, 2009, Pages 579~586
DOI : 10.5351/CKSS.2009.16.4.579
The multifactor dimensionality reduction (MDR) method has generally been used to study interaction effects in statistical models. But the MDR method cannot be applied in all cases; it applies only to case-control data. So two methods, E-MDR and D-MDR, are suggested, using a regression tree algorithm and dummy variables. We applied the methods to identify interaction effects of single nucleotide polymorphisms (SNPs) responsible for longissimus dorsi muscle area (LMA), carcass cold weight (CWT) and average daily gain (ADG) in a Hanwoo beef cattle population. Finally, we compare the results using a permutation test.
Test for Distribution Change of Dependent Errors
Na, Seong-Ryong ;
Communications for Statistical Applications and Methods, volume 16, issue 4, 2009, Pages 587~594
DOI : 10.5351/CKSS.2009.16.4.587
In this paper, the change-point problem for the error terms in linear regression models is considered. Since fixed or stochastic independent variables and weakly dependent errors are assumed, the usual multiple regression models and time series models, including ARMA, are covered. We use estimates of the probability density function based on residuals in order to test for a distribution change in the unobserved errors. Under some mild conditions, the test using the residuals is proved to have the same limiting distribution as the test based on the true errors.
A Dynamic Optimal Allocation for the Stratified Randomized Response Technique
Son, Chang-Kyoon ; Hong, Ki-Hak ; Lee, Gi-Sung ;
Communications for Statistical Applications and Methods, volume 16, issue 4, 2009, Pages 595~603
DOI : 10.5351/CKSS.2009.16.4.595
Typically, the standard optimal allocation method distributes the sample across strata taking survey cost into account. When the survey cost varies from unit to unit, a more practical allocation method is needed. In other words, according to the characteristics of each individual unit, we consider a dynamic optimal allocation method that first selects the survey unit with the maximum benefit-cost ratio. In this respect, the proposed method differs from the standard optimal allocation method, which allocates the sample to each stratum and then draws a random sample of the allocated size. This paper considers the dynamic optimal allocation method for the stratified randomized response technique, which surveys sensitive characteristics of survey units such as drug abuse, abortion, or alcoholism. We demonstrate the practical usefulness of the proposed method with a numerical example.
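The "select the unit with the maximum benefit-cost ratio first" idea can be sketched as a greedy loop. The unit names, benefits, and costs below are hypothetical, and the paper's allocation for the stratified randomized response design also involves stratum variances not modelled here:

```python
def dynamic_allocation(units, budget):
    """Greedy dynamic allocation: repeatedly pick the not-yet-surveyed
    unit with the largest benefit/cost ratio while the budget lasts.
    `units` is a list of (name, benefit, cost) triples (hypothetical)."""
    ranked = sorted(units, key=lambda u: u[1] / u[2], reverse=True)
    chosen, spent = [], 0.0
    for name, benefit, cost in ranked:
        if spent + cost <= budget:      # survey this unit only if affordable
            chosen.append(name)
            spent += cost
    return chosen

units = [("A", 9.0, 3.0), ("B", 4.0, 1.0), ("C", 6.0, 3.0), ("D", 2.0, 2.0)]
print(dynamic_allocation(units, 5.0))   # → ['B', 'A']
```

Unit B (ratio 4.0) is taken before unit A (ratio 3.0), and C and D are skipped once the budget is exhausted, illustrating how the ordering differs from fixed per-stratum allocation.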
I-TGARCH Models and Persistent Volatilities with Applications to Time Series in Korea
Hong, S.Y. ; Choi, S.M. ; Park, J.A. ; Baek, J.S. ; Hwang, S.Y. ;
Communications for Statistical Applications and Methods, volume 16, issue 4, 2009, Pages 605~614
DOI : 10.5351/CKSS.2009.16.4.605
TGARCH models, characterized by asymmetric volatilities, have been useful for analyzing various time series in financial econometrics. We are concerned with persistent volatility in the TGARCH context. Park et al. (2009) introduced the I-TGARCH process, which exhibits a certain persistency in volatility. This article applies the I-TGARCH model to various financial time series in Korea, and we find that I-TGARCH provides a better fit than competing models.
Window Configurations Comparison Based on Statistical Edge Detection in Images
Lim, Dong-Hoon ;
Communications for Statistical Applications and Methods, volume 16, issue 4, 2009, Pages 615~625
DOI : 10.5351/CKSS.2009.16.4.615
In this paper, we describe the Wilcoxon test and the T-test, well known for the two-sample location problem, for detecting edges under different window configurations. The choice of window configuration is an important factor in determining the performance and the cost of edge detectors. Our edge detectors are based on testing the mean values of local neighborhoods obtained under an edge model with an edge-height parameter. We compare three window configurations based on the statistical tests in terms of qualitative measures with edge maps, objective quantitative measures, and the CPU time for edge detection.
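The underlying idea, testing whether the two halves of a local window share a mean grey level, can be sketched with a Welch-type t statistic on a 1-D window; the paper's 2-D window configurations and the Wilcoxon variant are omitted in this sketch:

```python
from statistics import mean, pvariance

def edge_score(window_left, window_right):
    """Two-sample t-type statistic comparing the mean grey levels of the
    two halves of a local window; a large |t| suggests an edge between them."""
    n1, n2 = len(window_left), len(window_right)
    m1, m2 = mean(window_left), mean(window_right)
    v1, v2 = pvariance(window_left), pvariance(window_right)
    se = (v1 / n1 + v2 / n2) ** 0.5
    return (m1 - m2) / se if se > 0 else 0.0

row = [10, 11, 9, 10, 200, 201, 199, 200]   # sharp step edge in the middle
t = edge_score(row[:4], row[4:])
print(abs(t) > 3.0)   # → True: the large statistic flags an edge
```

Sliding such a window across an image row by row, and thresholding |t|, yields an edge map; the different window configurations compared in the paper change which pixels enter each half.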
A Study on a Relative Achievement Index of Foreign Language Ability
Heo, Sun-Yeong ; Chang, Duk-Joon ;
Communications for Statistical Applications and Methods, volume 16, issue 4, 2009, Pages 627~637
DOI : 10.5351/CKSS.2009.16.4.627
Our economic system depends heavily on the international trade market and demands a high level of foreign language ability from the younger generation. Scores on various internationally authorized language tests are used to estimate students' language ability before admission to universities or high schools. As the unemployment rate among young people rises, universities put their strength into the language abilities of their students, and each university uses authorized language scores in deciding which students win scholarships. This study suggests two indices to evaluate students' relative achievement in foreign language ability between two time points, taking their current abilities into account. The first, named the one-side relative achievement index, is defined as the ratio of the score gained between the two time points to the score remaining between the base-time score and the full score. The second, named the two-side relative achievement index, is defined in the same manner as the first when the score improves; otherwise, it is defined as the ratio of the amount of score lost to the current score at the base time point. The two-side achievement index is more useful since it has a smaller variation than the former and is easier to interpret. However, both indices are useful for comparing achievement across different tests.
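Read literally, the two definitions above can be computed as follows; the exact normalization used in the paper may differ, and the scores below are hypothetical:

```python
def one_side_index(base, later, full):
    """One-side relative achievement index: score gained between the two
    time points divided by the room left for improvement at base time."""
    return (later - base) / (full - base)

def two_side_index(base, later, full):
    """Two-side index: same as the one-side index when the score improves;
    otherwise the (negative) loss relative to the base-time score."""
    if later >= base:
        return (later - base) / (full - base)
    return (later - base) / base

# hypothetical test scored out of 990: 600 at base time, 750 later
print(round(one_side_index(600, 750, 990), 3))   # → 0.385
```

Because both indices are ratios, a student near the full score gets credit for closing a small remaining gap, which is what makes scores from different tests comparable.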
No Arbitrage Condition for Multi-Factor HJM Model under the Fractional Brownian Motion
Rhee, Joon-Hee ; Kim, Yoon-Tae ;
Communications for Statistical Applications and Methods, volume 16, issue 4, 2009, Pages 639~645
DOI : 10.5351/CKSS.2009.16.4.639
Fractional Brownian motion (fBm) has well-behaved tails and exhibits long memory while remaining Gaussian. In particular, it is well known that interest rates show long memory and non-Markovian behavior. We present a no-arbitrage condition for the HJM model under multi-factor fBm, reflecting the long-range dependence in the interest rate model.
LH-Moments of Some Distributions Useful in Hydrology
Murshed, Md. Sharwar ; Park, Byung-Jun ; Jeong, Bo-Yoon ; Park, Jeong-Soo ;
Communications for Statistical Applications and Methods, volume 16, issue 4, 2009, Pages 647~658
DOI : 10.5351/CKSS.2009.16.4.647
It is already known from previous studies that flood data tend to have heavier tails. Therefore, to predict future extreme levels, some agreement with the tail behavior of the extreme data is highly desirable. The LH-moments estimation method, a generalized form of L-moments, is a useful method for characterizing the upper part of a distribution. LH-moments are based on linear combinations of higher-order statistics. In this study, we have formulated the LH-moments of five distributions useful in hydrology: two types of three-parameter kappa distributions, the beta-κ distribution, the beta-p distribution and a generalized Gumbel distribution. Using LH-moments reduces the undue influence that small samples may have on the estimation of large-return-period events.
Application of Generalized Maximum Entropy Estimator to the Two-way Nested Error Component Model with Ill-Posed Data
Cheon, Soo-Young ;
Communications for Statistical Applications and Methods, volume 16, issue 4, 2009, Pages 659~667
DOI : 10.5351/CKSS.2009.16.4.659
Recently, Song and Cheon (2006) and Cheon and Lim (2009) developed the generalized maximum entropy (GME) estimator to solve ill-posed problems for the regression coefficients in the simple panel model. The models discussed consider individual effects and a spatial autoregressive disturbance. However, in many applications in economics the data may contain nested groupings. This paper considers a two-way error component model with nested groupings for ill-posed data and proposes the GME estimator of the unknown parameters. The performance of this estimator is compared with existing methods on simulated datasets. The results indicate that the GME method performs best in estimating the unknown parameters when the data are ill-posed.
A Note on Determination of Sample Size for a Likert Scale
Park, Jin-Woo ; Jung, Mi-Sook ;
Communications for Statistical Applications and Methods, volume 16, issue 4, 2009, Pages 669~673
DOI : 10.5351/CKSS.2009.16.4.669
When a social scientist prepares to conduct a survey, he/she faces the problem of deciding on an appropriate sample size. Sample size is closely connected with cost, time, and the precision of the sample estimate. It is thus important to choose a size appropriate for the survey, but this may be difficult for survey researchers not skilled in sampling theory. In this study, we propose a method to determine the sample size under certain assumptions when the quantity of interest is measured on a Likert scale.
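As a rough illustration only, here is the textbook large-sample calculation for estimating a mean to a given margin of error, treating Likert responses as numeric; this is a standard formula, not necessarily the method the authors propose:

```python
from math import ceil

def sample_size(sigma, margin, z=1.96):
    """Textbook sample size for estimating a mean to within `margin`
    with ~95% confidence (z=1.96), treating Likert responses as numeric."""
    return ceil((z * sigma / margin) ** 2)

# hypothetical 5-point Likert item: assumed sd ~ 1.2, desired margin 0.1
print(sample_size(1.2, 0.1))   # → 554
```

The assumed standard deviation is the weak point in practice; the paper's "certain assumptions" presumably address how to pin it down for ordinal Likert responses.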
Bootstrap Confidence Intervals of Classification Error Rate for a Block of Missing Observations
Chung, Hie-Choon ;
Communications for Statistical Applications and Methods, volume 16, issue 4, 2009, Pages 675~686
DOI : 10.5351/CKSS.2009.16.4.675
In this paper, it is assumed that there are two distinct multivariate normal populations with equal covariance matrices. We also assume that the two populations are equally likely and that the costs of misclassification are equal. The classification rule depends on whether or not the training samples include missing values. We consider bootstrap confidence intervals for the classification error rate when a block of observations is missing.
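A percentile-bootstrap interval for an error rate can be sketched as follows; the nearest-class-mean rule and the univariate toy data below stand in for the multivariate normal discriminant setting of the paper:

```python
import random
from statistics import mean

def nearest_mean_error(train, test):
    """Error rate of a nearest-class-mean rule (a univariate stand-in
    for the linear discriminant rule assumed in the paper)."""
    m0 = mean(x for x, y in train if y == 0)
    m1 = mean(x for x, y in train if y == 1)
    predict = lambda x: 0 if abs(x - m0) <= abs(x - m1) else 1
    return sum(predict(x) != y for x, y in test) / len(test)

def bootstrap_ci(data, level=0.90, n_boot=500, seed=7):
    """Percentile bootstrap confidence interval for the error rate."""
    rng = random.Random(seed)
    rates = []
    for _ in range(n_boot):
        boot = [rng.choice(data) for _ in data]   # resample with replacement
        if {y for _, y in boot} != {0, 1}:        # need both classes to train
            continue
        rates.append(nearest_mean_error(boot, data))
    rates.sort()
    k = len(rates)
    return rates[int((1 - level) / 2 * k)], rates[min(k - 1, int((1 + level) / 2 * k))]

data = [(x, 0) for x in (0.1, 0.3, 0.2, 0.4)] + \
       [(x, 1) for x in (0.9, 1.1, 1.0, 1.2)]
lo, hi = bootstrap_ci(data)
print(lo, hi)   # → 0.0 0.0 for this cleanly separated toy data
```

With overlapping classes the interval widens; the paper's contribution is handling the case where a block of the training observations is missing, which this sketch does not model.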
A Central Limit Theorem for the Linear Process in a Hilbert Space under Negative Association
Ko, Mi-Hwa ;
Communications for Statistical Applications and Methods, volume 16, issue 4, 2009, Pages 687~696
DOI : 10.5351/CKSS.2009.16.4.687
We prove a central limit theorem for negatively associated random variables in a Hilbert space and extend this result to the linear process generated by negatively associated random variables in a Hilbert space. Our result extends the central limit theorem for the linear process in a real space under negative association to the simplest case of an infinite-dimensional Hilbert space.
Goodness-of-Fit Tests for the Ordinal Response Models with Misspecified Links
Jeong, Kwang-Mo ; Lee, Hyun-Yung ;
Communications for Statistical Applications and Methods, volume 16, issue 4, 2009, Pages 697~705
DOI : 10.5351/CKSS.2009.16.4.697
The Pearson chi-squared statistic and the deviance statistic are widely used in assessing the goodness-of-fit of generalized linear models. But these statistics are not appropriate when continuous explanatory variables result in sparse cell frequencies. We propose a goodness-of-fit test statistic for cumulative logit models with ordinal responses. We consider grouping a dataset based on the ordinal scores obtained by fitting the assumed model, and propose a Pearson chi-squared-type test statistic obtained from the cross-classified table formed by the subgroups of ordinal scores and the response categories. Because the limiting distribution of the chi-squared-type statistic is intractable, we suggest a parametric bootstrap testing procedure to approximate the distribution of the proposed test statistic.
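The parametric bootstrap idea, simulate from the fitted null model and recompute the statistic to approximate its null distribution, can be sketched with a deliberately simplified null: a constant success probability in place of the cumulative logit model, and fixed groups in place of the ordinal-score subgroups:

```python
import random

def chisq(y, groups, p_hat):
    """Pearson-type statistic: standardized squared differences between
    observed and expected numbers of successes within each group."""
    stat = 0.0
    for g in groups:
        o = sum(y[i] for i in g)
        e = p_hat * len(g)
        stat += (o - e) ** 2 / (e * (1 - p_hat))
    return stat

def bootstrap_pvalue(y, groups, n_boot=500, seed=3):
    """Parametric bootstrap: simulate data from the fitted null model,
    refit, recompute the statistic, and count exceedances of t_obs."""
    rng = random.Random(seed)
    p_hat = sum(y) / len(y)
    t_obs = chisq(y, groups, p_hat)
    exceed = 0
    for _ in range(n_boot):
        yb = [1 if rng.random() < p_hat else 0 for _ in y]
        pb = sum(yb) / len(yb)                    # refit on simulated data
        if 0 < pb < 1 and chisq(yb, groups, pb) >= t_obs:
            exceed += 1
    return exceed / n_boot

# 20 binary responses in 4 fixed groups of 5
y = [0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 1, 1, 1, 1, 1, 1, 0]
groups = [range(0, 5), range(5, 10), range(10, 15), range(15, 20)]
print(bootstrap_pvalue(y, groups))
```

In the paper, the null model is the fitted cumulative logit model and the groups are formed from its ordinal scores, but the p-value mechanics are the same: the bootstrap replaces the intractable limiting distribution.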
Sign IV Cointegration Tests
Oh, Yu-Jin ;
Communications for Statistical Applications and Methods, volume 16, issue 4, 2009, Pages 707~711
DOI : 10.5351/CKSS.2009.16.4.707
We propose new cointegration tests using the signs of the regressors as instrumental variables. Our tests have an asymptotic standard normal distribution and are free from the dimension of the regressors under the null hypothesis of no cointegration. A Monte Carlo simulation shows that the proposed tests have a stable size and improved power. In particular, the tests have better power for small numbers of observations.
Effective Computation for Odds Ratio Estimation in Nonparametric Logistic Regression
Kim, Young-Ju ;
Communications for Statistical Applications and Methods, volume 16, issue 4, 2009, Pages 713~722
DOI : 10.5351/CKSS.2009.16.4.713
The estimation of odds ratios and corresponding confidence intervals for case-control data has traditionally been done by generalized linear models, which assume that the logarithm of the odds ratio is linearly related to the risk factors. We adapt the lower-dimensional approximation of Gu and Kim (2002) to provide faster computation in a nonparametric method for estimating the odds ratio, allowing flexibility of the estimating function, together with its Bayesian confidence interval under the Bayes model for the lower-dimensional approximations. Simulation studies showed that taking larger samples with the lower-dimensional approximations helps to improve the smoothing spline estimates of the odds ratio in this setting. The proposed method can be used to analyze case-control data in medical studies.
The Comparison of Imputation Methods in Time Series Data with Missing Values
Lee, Sung-Duck ; Choi, Jae-Hyuk ; Kim, Duck-Ki ;
Communications for Statistical Applications and Methods, volume 16, issue 4, 2009, Pages 723~730
DOI : 10.5351/CKSS.2009.16.4.723
Missing values in a time series can be treated as unknown parameters and estimated by maximum likelihood, or treated as random variables and predicted by the expectation of the unknown values given the data. The purpose of this study is to impute missing values, regarded either as maximum likelihood estimates or as random variables, in incomplete data, and to compare the two methods using an ARMA model. For illustration, the monthly mumps data reported from the national capital region over the years 2001~2006 are used, and the results of the two methods are compared using the SSF (sum of squares of forecasting errors).
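The two imputation ideas, a forecast-type estimate versus the conditional expectation given the data, can be contrasted on a toy AR(1) series. The model, parameter values, and missingness pattern below are hypothetical placeholders, not the mumps data analysis:

```python
import random

def simulate_ar1(n, phi, seed=11):
    """Simulate an AR(1) series x_t = phi * x_{t-1} + e_t, e_t ~ N(0, 1)."""
    rng = random.Random(seed)
    x = [rng.gauss(0, 1)]
    for _ in range(n - 1):
        x.append(phi * x[-1] + rng.gauss(0, 1))
    return x

def impute_forecast(x, t, phi):
    """One-step forecast from the past only: E[x_t | x_{t-1}] = phi * x_{t-1}."""
    return phi * x[t - 1]

def impute_smooth(x, t, phi):
    """Conditional expectation given both neighbours for an AR(1):
    E[x_t | x_{t-1}, x_{t+1}] = phi * (x_{t-1} + x_{t+1}) / (1 + phi^2)."""
    return phi * (x[t - 1] + x[t + 1]) / (1 + phi ** 2)

phi = 0.7
x = simulate_ar1(200, phi)
holes = range(10, 190, 10)      # artificially delete every 10th value
ssf = lambda f: sum((x[t] - f(x, t, phi)) ** 2 for t in holes)
print(ssf(impute_forecast), ssf(impute_smooth))
```

The smoothing-type imputation uses observations on both sides of the gap and in theory has the smaller error variance, which is the kind of comparison the SSF criterion above formalizes for the ARMA model.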