• Title/Summary/Keyword: Complete-case analysis

Comparison of EM and Multiple Imputation Methods with Traditional Methods in Monotone Missing Pattern

  • Kang, Shin-Soo
    • Journal of the Korean Data and Information Science Society, v.16 no.1, pp.95-106, 2005
  • Complete-case analysis is easy to carry out and may be adequate with a small amount of missing data. In general, however, it is not recommended because the resulting estimates are usually biased and inefficient. There are numerous alternatives to complete-case analysis. A natural alternative is available-case analysis, which uses all cases that contain the variables required for a specific task. The EM algorithm is a general approach for computing maximum likelihood estimates of parameters from incomplete data. These methods and multiple imputation (MI) are reviewed, and their performance is compared by simulation studies under a monotone missing pattern.

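The contrast between complete-case and available-case analysis described in the abstract can be sketched in a few lines of NumPy; the data values below are hypothetical, chosen only to show that the two analyses use different subsets of the cases.

```python
import numpy as np

# Toy data with missing values (np.nan): two variables, five cases.
# Hypothetical values chosen only to illustrate the two analyses.
data = np.array([
    [1.0,    10.0],
    [2.0,  np.nan],
    [3.0,    30.0],
    [np.nan, 40.0],
    [5.0,    50.0],
])

# Complete-case analysis: drop any row with a missing entry.
complete = data[~np.isnan(data).any(axis=1)]
cc_means = complete.mean(axis=0)     # uses only the 3 fully observed cases

# Available-case analysis: each variable uses all cases observed for it.
ac_means = np.nanmean(data, axis=0)  # 4 cases per variable

print(cc_means)
print(ac_means)
```

The two sets of means differ because complete-case analysis discards partially observed rows that available-case analysis retains.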
Fully Efficient Fractional Imputation for Incomplete Contingency Tables

  • Kang, Shin-Soo
    • Journal of the Korean Data and Information Science Society, v.15 no.4, pp.993-1002, 2004
  • Imputation procedures such as fully efficient fractional imputation (FEFI) or multiple imputation (MI) can be used to construct complete contingency tables from samples with partially classified responses. Variances of FEFI estimators of population proportions are derived. Simulation results, when data are missing completely at random, reveal that FEFI provides more efficient estimates of population proportions than either multiple imputation (MI) based on data augmentation or complete-case (CC) analysis, but neither FEFI nor MI improves on CC analysis with respect to the accuracy of estimating some parameters for the association between two variables, such as $\theta_{i+}\theta_{+j}-\theta_{ij}$ and the log odds-ratio.

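The association parameters mentioned in the abstract can be computed directly once a table is completed; a minimal NumPy sketch with hypothetical cell probabilities theta_ij:

```python
import numpy as np

# Hypothetical completed 2x2 table of cell probabilities theta_ij.
theta = np.array([[0.30, 0.20],
                  [0.10, 0.40]])

# Marginals theta_{i+} and theta_{+j}.
row = theta.sum(axis=1)
col = theta.sum(axis=0)

# Deviation from independence for cell (1,1):
# theta_{1+} * theta_{+1} - theta_{11}; negative here, so the cell
# occurs more often than independence would predict.
dev = row[0] * col[0] - theta[0, 0]

# Log odds-ratio: log( theta_11 * theta_22 / (theta_12 * theta_21) ).
log_or = np.log(theta[0, 0] * theta[1, 1] / (theta[0, 1] * theta[1, 0]))

print(dev)
print(log_or)
```

Both quantities are smooth functions of the cell probabilities, which is why their estimates inherit whatever bias or inefficiency the table-completion step introduces.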
MLE for Incomplete Contingency Tables with Lagrangian Multiplier

  • Kang, Shin-Soo
    • Journal of the Korean Data and Information Science Society, v.17 no.3, pp.919-925, 2006
  • The maximum likelihood estimate (MLE) is obtained from the partial log-likelihood function for the cell probabilities of two-way incomplete contingency tables proposed by Chen and Fienberg (1974). The partial log-likelihood function is modified by adding a Lagrangian multiplier so that constraints can be incorporated. Variances of the MLE estimators of population proportions are derived from the matrix of second derivatives of the log-likelihood with respect to the cell probabilities. Simulation results, when data are missing at random (MAR), reveal that complete-case (CC) analysis produces biased estimates of joint probabilities and is less efficient than either MLE or MI. Both MLE and MI give consistent results under MAR. MLE provides more efficient estimates of population proportions than either multiple imputation (MI) based on data augmentation or complete-case analysis. The standard errors of the MLE from the proposed method using the Lagrangian multiplier are valid and show less variation than the standard errors from MI and CC.

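Maximum likelihood estimation for a partially classified two-way table can be sketched with a generic proportional-allocation EM; the counts below are hypothetical, and this is a standard EM iteration, not the paper's Lagrangian-multiplier formulation.

```python
import numpy as np

# Hypothetical data for a 2x2 table with partially classified cases:
# fully classified counts plus supplemental one-way margins.
n_full = np.array([[40.0, 10.0],
                   [20.0, 30.0]])
row_only = np.array([15.0, 5.0])   # cases with only the row variable observed
col_only = np.array([8.0, 12.0])   # cases with only the column variable observed

theta = np.full((2, 2), 0.25)      # initial cell probabilities
for _ in range(200):
    # E-step: allocate partially classified counts across cells
    # in proportion to the current cell probabilities.
    exp_row = row_only[:, None] * theta / theta.sum(axis=1, keepdims=True)
    exp_col = col_only[None, :] * theta / theta.sum(axis=0, keepdims=True)
    completed = n_full + exp_row + exp_col
    # M-step: re-estimate cell probabilities from the completed table.
    theta = completed / completed.sum()

print(theta)  # MLE of the cell probabilities under MAR
```

Each iteration preserves the total count, and the fixed point of this allocate-and-normalize loop is the MLE under the MAR assumption.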
Comparison of Five Single Imputation Methods in General Missing Pattern

  • Kang, Shin-Soo
    • Journal of the Korean Data and Information Science Society, v.15 no.4, pp.945-955, 2004
  • Complete-case analysis is easy to carry out and may be adequate with a small amount of missing data. In general, however, it is not recommended because the resulting estimates are usually biased and inefficient. There are numerous alternatives to complete-case analysis. One alternative is single imputation. Some of the most common single imputation methods are reviewed, and their performance is compared by simulation studies.

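The simplest of the common single imputation methods, unconditional mean imputation, can be illustrated briefly; the values are hypothetical, and the final comparison shows the variance shrinkage that is one reason single imputation understates uncertainty.

```python
import numpy as np

# One common single-imputation method: unconditional mean imputation.
# Hypothetical variable with missing entries.
x = np.array([2.0, np.nan, 4.0, 6.0, np.nan])

mean_obs = np.nanmean(x)                 # mean of the observed values
x_imputed = np.where(np.isnan(x), mean_obs, x)

print(x_imputed)                         # [2. 4. 4. 6. 4.]

# Known drawback: imputing the mean shrinks the variance relative to
# the observed data, so downstream standard errors are too small.
print(np.var(x_imputed) < np.nanvar(x))  # True
```

Other single imputation methods (regression imputation, hot-deck) replace the constant `mean_obs` with a case-specific prediction or a donor value, but share the same single-fill structure.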
Estimation of Log-Odds Ratios for Incomplete $2{\times}2$ Tables with Covariates using FEFI

  • Kang, Shin-Soo;Bae, Je-Min
    • Journal of the Korean Data and Information Science Society, v.18 no.1, pp.185-194, 2007
  • Covariate information can be exploited in fully efficient fractional imputation (FEFI). A new method, FEFI with logistic regression, is proposed to construct complete contingency tables. The jackknife method is used to obtain standard errors of the log-odds ratio from the table completed by the new method. Simulation results, when the covariates carry more information about the categorical variables, reveal that the new method provides more efficient estimates of the log-odds ratio than either multiple imputation (MI) based on data augmentation or complete-case analysis.

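A delete-one jackknife standard error for the log-odds ratio of a completed 2x2 table can be sketched as follows; the counts are hypothetical, and this is a generic observation-level jackknife, not the paper's exact procedure for FEFI-completed tables.

```python
import numpy as np

# Hypothetical completed 2x2 table of counts.
n = np.array([[30.0, 10.0],
              [15.0, 25.0]])

def log_or(t):
    """Log odds-ratio of a 2x2 table of counts."""
    return np.log(t[0, 0] * t[1, 1] / (t[0, 1] * t[1, 0]))

# Delete-one jackknife: removing one observation from cell (i, j)
# gives the same leave-one-out estimate for every observation in that
# cell, so we weight each of the 4 estimates by its cell count.
N = n.sum()
estimates, weights = [], []
for i in range(2):
    for j in range(2):
        t = n.copy()
        t[i, j] -= 1.0
        estimates.append(log_or(t))
        weights.append(n[i, j])
estimates, weights = np.array(estimates), np.array(weights)

jack_mean = np.sum(weights * estimates) / N
se = np.sqrt((N - 1) / N * np.sum(weights * (estimates - jack_mean) ** 2))

print(log_or(n), se)
```

For this table the jackknife standard error is close to the delta-method value sqrt(1/30 + 1/10 + 1/15 + 1/25), as expected for a smooth statistic.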
Evaluation of the influence of interface elements for structure - isolated footing - soil interaction analysis

  • Rajashekhar Swamy, H.M.;Krishnamoorthy, A.;Prabakhara, D.L.;Bhavikatti, S.S.
    • Interaction and multiscale mechanics, v.4 no.1, pp.65-83, 2011
  • In this study, two extreme cases of compatibility of the horizontal displacements between the foundation and soil are considered, for which the pressures and settlements of the isolated footings and the member end actions in the structural elements are obtained using three-dimensional models and numerical experiments. The first case, termed the uncoupled analysis, assumes complete slip between foundation and soil. The second case, termed the coupled analysis, assumes complete welding of the joints between the foundation and soil elements. The model and the corresponding computer program developed here simulate these two extreme states of compatibility, giving insight into the variation of horizontal displacements and horizontal stresses, and thereby into the influence of using interface elements in the soil-structure interaction analysis of three-dimensional multiscale structures supported on isolated footings.

PERFORMANCE EVALUATION OF INFORMATION CRITERIA FOR THE NAIVE-BAYES MODEL IN THE CASE OF LATENT CLASS ANALYSIS: A MONTE CARLO STUDY

  • Dias, Jose G.
    • Journal of the Korean Statistical Society, v.36 no.3, pp.435-445, 2007
  • This paper addresses for the first time the use of complete-data information criteria in unsupervised learning of the Naive-Bayes model. A Monte Carlo study sets up a large experimental design to assess these criteria, unusual in the Bayesian network literature. The simulation results show that complete-data information criteria underperform the Bayesian information criterion (BIC) for these Bayesian networks.

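The model-selection step that BIC performs in studies like this can be illustrated with a toy comparison of candidate latent-class models; the log-likelihoods and parameter counts below are hypothetical.

```python
import math

# BIC = -2 * loglik + p * log(n): penalizes fit by model complexity.
n = 500  # hypothetical sample size

candidates = {
    # number of latent classes: (maximized log-likelihood, free parameters)
    1: (-1450.0, 5),
    2: (-1380.0, 11),
    3: (-1375.0, 17),
}

bic = {k: -2.0 * ll + p * math.log(n) for k, (ll, p) in candidates.items()}
best = min(bic, key=bic.get)

print(best)  # 2: the small fit gain from a 3rd class does not
             # justify 6 additional parameters
```

Complete-data criteria differ only in replacing the observed-data log-likelihood with a completed-data analogue; the comparison step is the same.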
A validity study on SSI analysis by comparing the complete system model and the underground structure fixed-end model (연속체 모델과 지하구조물 고정단 모델의 비교를 통한 SSI 해석의 타당성 연구)

  • You, Kwang-Ho;Kim, Seung-Jin
    • Journal of Korean Tunnelling and Underground Space Association, v.20 no.5, pp.757-772, 2018
  • Recently, earthquakes have occurred in large cities such as Gyeongju and Pohang, and seismic analysis studies have been actively conducted in various fields. However, since most previous seismic analyses have treated the structure and the ground separately, studies of the complete soil-structure dynamic interaction are lacking. Therefore, in this study a sensitivity analysis is conducted with MIDAS GEN and MIDAS GTS NX, applying respectively the underground-structure fixed-end model, which considers only the building, and the complete system model, which considers both the building and the ground, and the validity of dynamic analysis considering SSI is examined. The results show that under most conditions the underground-structure fixed-end model yields a smaller maximum horizontal displacement of the tall building, a larger bending stress, and a smaller extent of the weak part than the complete system model. Therefore, using the complete system model, which accounts for soil-structure interaction, is expected to be more reasonable in seismic analysis.

Radiographic analysis of the management of tooth extractions in head and neck-irradiated patients: a case series

  • Oliveira, Samanta V.;Vellei, Renata S.;Heguedusch, Daniele;Domaneschi, Carina;Costa, Claudio;Gallo, Camila de Barros
    • Imaging Science in Dentistry, v.51 no.3, pp.323-328, 2021
  • Tooth extraction after head and neck radiotherapy exposes patients to an increased risk for osteoradionecrosis of the jaw. This study reports the results of a radiographic analysis of bone neoformation after tooth extraction in a case series of patients who underwent radiation therapy. No patients developed osteoradionecrosis within a follow-up of 1 year. Complete mucosal repair was observed 30 days after surgery, while no sign of bone formation was observed 2 months after the dental extractions. Pixel intensity and fractal dimension image analyses only showed significant bone formation 12 months after the tooth extractions. These surgical procedures must follow a strict protocol that includes antibiotic prophylaxis and therapy and complete wound closure, since bone formation at the alveolar socket occurs at a slower pace in patients who have undergone head and neck radiotherapy.

A Study on Approximate and Exact Algorithms to Minimize Makespan on Parallel Processors (竝列處理機械상에서 總作業完了時間의 最小化解法에 관한 硏究)

  • Ahn, Sang-Hyung;Lee, Song-Kun
    • Journal of the Korean Operations Research and Management Science Society, v.16 no.2, pp.14-35, 1991
  • The purpose of this study is to develop an efficient exact algorithm for the problem of scheduling n independent jobs on m unequal parallel processors to minimize makespan. Efficient solutions are already known for the preemptive case. For the non-preemptive case, however, this problem belongs to the set of strongly NP-complete problems, so it is unlikely that a polynomial-time algorithm can be found. This is why most investigations have been directed toward fast approximate algorithms and the worst-case analysis of algorithms. Recently, great advances have been made in mathematical theories regarding Lagrangean relaxation and the subgradient optimization procedure that updates the Lagrangean multipliers. By combining these mathematical tools with branch-and-bound procedures, there have been some successes in constructing pseudo-polynomial time algorithms for previously unsolved NP-complete problems. This study applies similar methodologies to the unequal parallel processor problem to find an efficient exact algorithm.

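A fast approximate algorithm of the kind the abstract contrasts with exact methods is greedy list scheduling: assign jobs, longest first, to the machine on which each would finish earliest. The job lengths and machine speeds below are hypothetical; this is a standard heuristic, not the paper's Lagrangean branch-and-bound algorithm.

```python
# Greedy list scheduling for makespan on unequal (uniform-speed)
# parallel processors: a standard approximation heuristic.
def greedy_makespan(jobs, speeds):
    """Assign jobs, longest first, to the machine that finishes them earliest."""
    finish = [0.0] * len(speeds)
    for p in sorted(jobs, reverse=True):
        # Choose the machine minimizing this job's completion time,
        # given current machine loads and processing speeds.
        m = min(range(len(speeds)), key=lambda i: finish[i] + p / speeds[i])
        finish[m] += p / speeds[m]
    return max(finish)

jobs = [7, 5, 4, 3, 2, 2]
speeds = [2.0, 1.0]          # machine 0 is twice as fast as machine 1
print(greedy_makespan(jobs, speeds))
```

For identical machines this is the classical LPT rule; sorting longest-first is what gives the heuristic its good worst-case ratio, which is the kind of bound worst-case analysis studies.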