REFERENCE LINKING PLATFORM OF KOREA S&T JOURNALS
Communications for Statistical Applications and Methods
Journal Basic Information
Publisher : The Korean Statistical Society
Volume & Issues
Volume 23, Issue 4 - Jul 2016
Volume 23, Issue 3 - May 2016
Volume 23, Issue 2 - Mar 2016
Volume 23, Issue 1 - Jan 2016
Practice of causal inference with the propensity of being zero or one: assessing the effect of arbitrary cutoffs of propensity scores
Kang, Joseph ; Chan, Wendy ; Kim, Mi-Ok ; Steiner, Peter M. ;
Communications for Statistical Applications and Methods, volume 23, issue 1, 2016, Pages 1~20
DOI : 10.5351/CSAM.2016.23.1.001
Causal inference methodologies have been developed over the past decade to estimate the unconfounded effect of an exposure under several key assumptions. These assumptions include, but are not limited to, the stable unit treatment value assumption, the strong ignorability of treatment assignment assumption, and the assumption that propensity scores be bounded away from zero and one (the positivity assumption). Of these, the first two have received much attention in the literature, whereas the positivity assumption has been discussed in only a few recent papers. Propensity scores of zero or one indicate deterministic exposure, so causal effects cannot be defined for such subjects; these subjects must be removed because no comparable comparison group can be found for them. In this paper, using currently available causal inference methods, we evaluate the effect of arbitrary cutoffs in the distribution of propensity scores and the impact of those decisions on bias and efficiency. We propose a tree-based method that performs well in terms of bias reduction when the definition of positivity is based on a single confounder. This tree-based method can be easily implemented using the statistical software R, and R code for the studies is available online.
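The trimming step the abstract describes, discarding units whose propensity scores fall outside an arbitrary cutoff band before computing a weighted effect estimate, can be sketched as follows. This is a minimal stdlib-Python illustration, not the paper's R implementation or its tree-based method; the propensity scores are taken as known and the toy data are invented for the demo.

```python
import random

def trimmed_ipw_effect(y, z, e, cutoff=0.05):
    """Hajek-style inverse-probability-weighted treatment effect,
    keeping only units with propensity score in [cutoff, 1 - cutoff]."""
    kept = [(yi, zi, ei) for yi, zi, ei in zip(y, z, e)
            if cutoff <= ei <= 1 - cutoff]
    if not kept:
        raise ValueError("no units survive the cutoff")
    t1 = sum(zi * yi / ei for yi, zi, ei in kept)
    w1 = sum(zi / ei for yi, zi, ei in kept)
    t0 = sum((1 - zi) * yi / (1 - ei) for yi, zi, ei in kept)
    w0 = sum((1 - zi) / (1 - ei) for yi, zi, ei in kept)
    return t1 / w1 - t0 / w0

# Toy data: one confounder x drives both treatment and outcome;
# the true treatment effect is 1, and some propensity scores fall
# outside [0.05, 0.95], so the cutoff actually removes units.
rng = random.Random(0)
y, z, e = [], [], []
for _ in range(5000):
    x = rng.random()
    ei = 0.02 + 0.96 * x
    zi = 1 if rng.random() < ei else 0
    y.append(2 * x + zi + rng.gauss(0, 0.1))
    z.append(zi)
    e.append(ei)

effect = trimmed_ipw_effect(y, z, e, cutoff=0.05)
```

With cutoff=0.0 the near-zero and near-one scores produce extreme weights and inflate the variance, which is exactly the trade-off between cutoff choice, bias, and efficiency that the paper studies.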
Asymptotic computation of Greeks under a stochastic volatility model
Park, Sang-Hyeon ; Lee, Kiseop ;
Communications for Statistical Applications and Methods, volume 23, issue 1, 2016, Pages 21~32
DOI : 10.5351/CSAM.2016.23.1.021
We study asymptotic expansion formulae for the numerical computation of Greeks (i.e., sensitivities) in finance. Our approach is based on the integration-by-parts formula of the Malliavin calculus. We propose an asymptotic expansion of Greeks for a stochastic volatility model using the Greeks formulae of the Black-Scholes model, and apply a singular perturbation method to derive the asymptotic formulae. We also provide numerical simulations of our method and compare it to the finite-difference Monte Carlo approach.
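As a concrete baseline for the quantities involved, here is a stdlib-Python sketch of a Greek in the Black-Scholes setting: the closed-form delta and its finite-difference Monte Carlo counterpart (with common random numbers across the bumped prices). The stochastic-volatility expansion itself is not reproduced; the parameter values are illustrative.

```python
import math, random

def Phi(x):
    """Standard normal CDF."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def bs_call_delta(S, K, r, sigma, T):
    """Closed-form Black-Scholes delta of a European call: Phi(d1)."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    return Phi(d1)

def mc_fd_delta(S, K, r, sigma, T, n=100_000, h=0.01, seed=1):
    """Central finite-difference delta estimated by Monte Carlo.
    Reusing the same normal draw for both bumped spot prices
    (common random numbers) keeps the difference low-variance."""
    rng = random.Random(seed)
    disc = math.exp(-r * T)
    up = down = 0.0
    for _ in range(n):
        z = rng.gauss(0, 1)
        growth = math.exp((r - 0.5 * sigma**2) * T + sigma * math.sqrt(T) * z)
        up += max((S + h) * growth - K, 0.0)
        down += max((S - h) * growth - K, 0.0)
    return disc * (up - down) / (2 * h * n)

closed = bs_call_delta(100, 100, 0.05, 0.2, 1.0)   # Phi(0.35), about 0.637
mc = mc_fd_delta(100, 100, 0.05, 0.2, 1.0)
```

The two values agree closely here; the paper's point is that for stochastic volatility models, where no closed form exists, an asymptotic expansion can replace the expensive Monte Carlo side of this comparison.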
Variance components estimation in the presence of drift
Kim, Jaehee ; Ogden, Todd ;
Communications for Statistical Applications and Methods, volume 23, issue 1, 2016, Pages 33~45
DOI : 10.5351/CSAM.2016.23.1.033
Variance components should be estimated allowing for mean change when the mean of the observations drifts gradually over time. We study consistent estimators of the variance components in a model with an underlying drift function. We propose a new variance estimator based on a Fourier estimate of the variation and prove its asymptotic consistency. The proposed procedure is compared empirically with variance estimators that remove the trend first. The results show that our variance estimator has a smaller mean squared error, with performance depending on the drift pattern. We apply the proposed estimator to Nile River flow data and resting-state fMRI data.
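The basic idea, that noise variance can be estimated consistently even when the mean drifts, can be illustrated with the classical difference-based (Rice) estimator, a simpler relative of the Fourier-based estimator proposed in the paper, shown here only to make the setting concrete (stdlib Python, invented data):

```python
import math, random, statistics

def rice_variance(y):
    """Difference-based noise-variance estimate for y_i = m(t_i) + eps_i.
    First differences cancel a slowly varying drift m, and
    E[(y_{i+1} - y_i)^2] ~= 2 * sigma^2 for iid noise."""
    n = len(y)
    return sum((y[i + 1] - y[i]) ** 2 for i in range(n - 1)) / (2 * (n - 1))

# Sinusoidal drift plus Gaussian noise with true variance 0.25.
rng = random.Random(42)
n = 2000
y = [3 * math.sin(2 * math.pi * i / n) + rng.gauss(0, 0.5) for i in range(n)]

sigma2_hat = rice_variance(y)     # close to 0.25 despite the drift
naive = statistics.pvariance(y)   # badly inflated by the drift itself
```

The naive sample variance absorbs the drift and overshoots by an order of magnitude, which is why drift-aware estimators of the kind the paper proposes are needed.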
A composite estimator for stratified two stage cluster sampling
Lee, Sang Eun ; Lee, Pu Reum ; Shin, Key-Il ;
Communications for Statistical Applications and Methods, volume 23, issue 1, 2016, Pages 47~55
DOI : 10.5351/CSAM.2016.23.1.047
Stratified cluster sampling is widely used for effective parameter estimation because it reduces time and cost. Probability proportional to size (PPS) sampling is used when the numbers of cluster elements differ significantly, whereas simple random sampling (SRS) is commonly used for simplicity when the cluster sizes are nearly equal. It is also known that the ratio estimator performs well when the total number of population elements is known, while the two-stage cluster estimator should be used when the total number of elements in the population is unknown or inaccurate. In this study we suggest a composite estimator that combines the ratio estimator and the two-stage cluster estimator to obtain a better estimate under certain population circumstances. Simulation studies are conducted to compare the suggested estimator with the two component estimators.
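A stylized stdlib-Python sketch of the combination the abstract describes: a two-stage (cluster-expansion) estimate of a population total, a ratio adjustment using the known total number of elements, and a convex combination of the two. The weight alpha is fixed at 0.5 purely for illustration; the paper's choice of weight and the conditions under which the composite wins are not reproduced here.

```python
import random

def two_stage_total(N, sampled):
    """Two-stage cluster estimate of a population total.
    `sampled` is a list of (M_i, ys) pairs: cluster size M_i and the
    SRS of element values ys drawn from that cluster; N is the total
    number of clusters in the population."""
    n = len(sampled)
    return (N / n) * sum(M_i * sum(ys) / len(ys) for M_i, ys in sampled)

def composite_total(N, M_total, sampled, alpha=0.5):
    """Convex combination of the ratio-adjusted estimate (which uses
    the known element total M_total) and the plain two-stage estimate."""
    y_hat = two_stage_total(N, sampled)
    m_hat = (N / len(sampled)) * sum(M_i for M_i, _ in sampled)
    y_ratio = y_hat / m_hat * M_total
    return alpha * y_ratio + (1 - alpha) * y_hat

# Toy population: 50 clusters of 20 elements each.
rng = random.Random(7)
clusters = [[rng.gauss(10, 2) for _ in range(20)] for _ in range(50)]
true_total = sum(sum(c) for c in clusters)

picked = rng.sample(clusters, 10)                        # stage 1: SRS of clusters
sampled = [(len(c), rng.sample(c, 5)) for c in picked]   # stage 2: SRS within each
est = composite_total(N=50, M_total=50 * 20, sampled=sampled)
```

With equal cluster sizes the ratio adjustment changes little; the composite's benefit appears when the estimated element total m_hat is noisy, which is the population circumstance the paper targets.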
Type-II stepwise progressive censoring
Bayat, Mohammad ; Torabi, Hamzeh ;
Communications for Statistical Applications and Methods, volume 23, issue 1, 2016, Pages 57~70
DOI : 10.5351/CSAM.2016.23.1.057
Type-II progressive censoring is one of the censoring methods frequently used in clinical studies, reliability trials, quality control of products, and industrial experiments. Sometimes in Type-II progressive censoring experiments the failure rate is low, so the waiting time to observe failures becomes very long, yet the experimenter may have to terminate the experiment before a predetermined time. In this article we first recall two generalized types of Type-II progressive censoring, and then modify the removal scheme of Type-II progressive censoring so that the termination time of the experiment decreases without reducing the quality of inference. This is achieved by decreasing the withdrawals throughout the steps of the experiment in a particular, reasonable way. A simulation study is conducted, and the results are tabulated at the end of the article to compare the introduced method with Type-II progressive censoring.
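The removal mechanism of standard Type-II progressive censoring, withdrawing R_i surviving units at random immediately after the i-th observed failure, can be simulated directly. A stdlib-Python sketch with exponential lifetimes; the modified removal scheme proposed in the article is not reproduced, and the scheme R below is illustrative.

```python
import random

def progressive_type2(lifetimes, R, rng=random):
    """Return the m observed failure times under removal scheme R:
    after the i-th failure, R[i] of the surviving units are withdrawn
    at random. Requires len(R) + sum(R) == len(lifetimes)."""
    assert len(R) + sum(R) == len(lifetimes)
    alive = list(lifetimes)
    observed = []
    for r in R:
        t = min(alive)            # next failure among units still on test
        observed.append(t)
        alive.remove(t)
        for _ in range(r):        # progressive withdrawal
            alive.remove(rng.choice(alive))
    return observed

# Exponential(mean 1) lifetimes: n = 10 units, m = 5 observed failures.
rng = random.Random(3)
R = [0, 1, 0, 2, 2]               # sum(R) = 5 units withdrawn during the test
sample = progressive_type2([rng.expovariate(1.0) for _ in range(10)], R, rng)

# For exponential lifetimes, the MLE of the mean is the total time on
# test divided by m: each unit withdrawn at the i-th failure accrued
# sample[i] units of test time.
m = len(R)
ttt = sum(sample) + sum(r * t for r, t in zip(R, sample))
theta_hat = ttt / m
```

The termination time is the last entry of `sample`; front-loading withdrawals (large early R_i) shortens it, which is the lever the article's modified scheme manipulates.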
A note on SVM estimators in RKHS for the deconvolution problem
Lee, Sungho ;
Communications for Statistical Applications and Methods, volume 23, issue 1, 2016, Pages 71~83
DOI : 10.5351/CSAM.2016.23.1.071
In this paper we discuss a deconvolution density estimator obtained using support vector machines (SVM) and Tikhonov's regularization method for solving ill-posed problems in a reproducing kernel Hilbert space (RKHS). A remarkable property of the SVM is that it leads to sparse solutions, but the support vector deconvolution density estimator does not preserve sparsity as well as expected. Thus, in Section 3, we propose another support vector deconvolution estimator (method II) that leads to a very sparse solution. The performance of the deconvolution density estimators based on the support vector method is compared with the classical kernel deconvolution density estimator for the important cases of Gaussian and Laplacian measurement error by means of a simulation study. In the case of Gaussian error, the proposed support vector deconvolution estimator shows the same performance as the classical kernel deconvolution density estimator.
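For reference, the classical kernel deconvolution density estimator that serves as the benchmark has an explicit form when the measurement error is Laplace(0, b) and the kernel K is Gaussian: the deconvoluting kernel is L(u) = K(u) - (b/h)^2 K''(u) = K(u) * (1 - (b/h)^2 (u^2 - 1)). A stdlib-Python sketch; the paper's SVM/RKHS estimators are not reproduced, and the data and bandwidth are illustrative.

```python
import math, random

def deconv_density(x, Y, b, h):
    """Deconvoluting kernel density estimate at x from contaminated
    data Y_j = X_j + eps_j with eps_j ~ Laplace(0, b), Gaussian kernel.
    Deconvoluting kernel: L(u) = K(u) * (1 - (b/h)**2 * (u*u - 1))."""
    c = (b / h) ** 2
    total = 0.0
    for yj in Y:
        u = (x - yj) / h
        K = math.exp(-0.5 * u * u) / math.sqrt(2 * math.pi)
        total += K * (1 - c * (u * u - 1))
    return total / (len(Y) * h)

def laplace(b, rng):
    """Laplace(0, b) draw by inverse CDF (symmetric, so the sign
    convention is immaterial)."""
    u = rng.random() - 0.5
    return -b * math.copysign(math.log(1 - 2 * abs(u)), u)

# Standard normal X observed with Laplace(0, 0.3) measurement error.
rng = random.Random(11)
Y = [rng.gauss(0, 1) + laplace(0.3, rng) for _ in range(3000)]

f0 = deconv_density(0.0, Y, b=0.3, h=0.4)   # true N(0,1) density at 0 is ~0.399
mass = sum(deconv_density(x / 10, Y, 0.3, 0.4) for x in range(-50, 51)) * 0.1
```

Note that L integrates to one but is not nonnegative, so the estimate can dip below zero in the tails; this is a known feature of classical deconvolution estimators.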
Efficient simulation using saddlepoint approximation for aggregate losses with large frequencies
Cho, Jae-Rin ; Ha, Hyung-Tae ;
Communications for Statistical Applications and Methods, volume 23, issue 1, 2016, Pages 85~91
DOI : 10.5351/CSAM.2016.23.1.085
Aggregate claim amounts with large claim frequencies are a major concern for automobile insurance companies. In this paper, we show that a new hybrid method combining the analytical saddlepoint approximation with Monte Carlo simulation is computationally efficient. We provide numerical comparisons between the hybrid method and the usual Monte Carlo simulation.
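A stdlib-Python sketch of the two ingredients, for a compound Poisson sum with exponential severities (chosen because the saddlepoint equation then solves in closed form): the Lugannani-Rice tail approximation built from the cumulant generating function K(t) = lambda*(M(t) - 1) with M(t) = 1/(1 - theta*t), and a plain Monte Carlo estimate for comparison. The paper's particular hybrid scheme is not reproduced; the parameters are illustrative.

```python
import math, random

def lr_tail(s, lam, theta):
    """Lugannani-Rice saddlepoint approximation to P(S > s) for
    S = X_1 + ... + X_N, N ~ Poisson(lam), X_i ~ Exp(mean theta).
    K'(t) = s has the closed-form root
    t_hat = (1 - sqrt(lam*theta/s)) / theta."""
    t_hat = (1 - math.sqrt(lam * theta / s)) / theta
    K = lam * (1 / (1 - theta * t_hat) - 1)
    Kpp = 2 * lam * theta**2 / (1 - theta * t_hat) ** 3
    w = math.copysign(math.sqrt(2 * (t_hat * s - K)), t_hat)
    u = t_hat * math.sqrt(Kpp)
    Phi = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))
    phi = lambda x: math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)
    return 1 - Phi(w) + phi(w) * (1 / u - 1 / w)

def mc_tail(s, lam, theta, n_sim=100_000, seed=5):
    """Plain Monte Carlo estimate of P(S > s)."""
    rng = random.Random(seed)
    L = math.exp(-lam)
    hits = 0
    for _ in range(n_sim):
        k, p = 0, 1.0             # Knuth's Poisson sampler
        while p >= L:
            p *= rng.random()
            k += 1
        N = k - 1
        S = sum(rng.expovariate(1 / theta) for _ in range(N))
        hits += S > s
    return hits / n_sim

sp = lr_tail(60, lam=50, theta=1.0)   # tail one std dev above the mean of 50
mc = mc_tail(60, lam=50, theta=1.0)
```

The saddlepoint value is essentially instantaneous while the Monte Carlo estimate costs millions of random draws; combining the two is what makes a hybrid attractive for large claim frequencies.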