The inference and estimation for latent discrete outcomes with a small sample
Choi, Hyung; Chung, Hwan
In behavioral research, significant attention has been paid to stage-sequential processes in longitudinal data. Latent class profile analysis (LCPA) is a useful method for studying sequential patterns of behavioral development via a two-step identification process: identifying a small number of latent classes at each measurement occasion, and then identifying two or more homogeneous subgroups in which individuals exhibit a similar sequence of latent class memberships over time. Maximum likelihood (ML) estimates for LCPA are easily obtained by the expectation-maximization (EM) algorithm, and Bayesian inference can be implemented via Markov chain Monte Carlo (MCMC). However, unusual properties of the LCPA likelihood can cause difficulties for both ML and Bayesian estimation and inference in small samples. This article describes and addresses the erratic behavior of conventional ML and Bayesian estimates for LCPA with small samples, and argues that these problems can be alleviated with a small amount of prior input. We evaluate the performance of likelihood-based and MCMC-based estimates under the proposed prior in drawing inferences over repeated sampling. Our simulations show that estimates from the proposed methods outperform those from the conventional ML and Bayesian methods.
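The stabilization idea in the abstract, using a small amount of prior input to keep ML-type estimates away from the boundary of the parameter space in small samples, can be illustrated with a minimal sketch. The code below is not the authors' LCPA implementation; it is a posterior-mode EM for a plain latent class model with binary items, where Dirichlet/Beta prior counts (the hyperparameters `alpha` and `beta` are illustrative choices) smooth the M-step updates. Setting `alpha = beta = 1` recovers ordinary ML.

```python
import numpy as np

def lca_em(Y, n_classes, alpha=1.0, beta=1.0, n_iter=200, seed=0):
    """Posterior-mode EM for a simple latent class model with binary items.

    Y: (n, p) array of 0/1 responses.
    alpha, beta: Dirichlet/Beta prior hyperparameters (illustrative);
    alpha = beta = 1 gives ordinary ML, while values > 1 add a small
    amount of prior information that keeps estimates off the boundary.
    """
    rng = np.random.default_rng(seed)
    n, p = Y.shape
    pi = np.full(n_classes, 1.0 / n_classes)            # class prevalences
    rho = rng.uniform(0.25, 0.75, size=(n_classes, p))  # item-response probs
    for _ in range(n_iter):
        # E-step: posterior class-membership probabilities per individual
        log_post = (Y @ np.log(rho).T + (1 - Y) @ np.log(1 - rho).T
                    + np.log(pi))
        log_post -= log_post.max(axis=1, keepdims=True)
        post = np.exp(log_post)
        post /= post.sum(axis=1, keepdims=True)
        # M-step: posterior-mode updates; the prior counts (alpha - 1,
        # beta - 1) act as smoothing pseudo-observations
        nk = post.sum(axis=0)
        pi = (nk + alpha - 1) / (n + n_classes * (alpha - 1))
        rho = (post.T @ Y + beta - 1) / (nk[:, None] + 2 * (beta - 1))
    return pi, rho
```

With `alpha, beta > 1`, no prevalence or item-response probability can collapse to exactly 0 or 1, which is the small-sample pathology the abstract refers to; LCPA extends this idea to class sequences over multiple measurement occasions.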
Chung H and Anthony JC (2013). A Bayesian approach to a multiple-group latent class-profile analysis: the timing of drinking onset and subsequent drinking behaviors among US adolescents, Structural Equation Modeling: A Multidisciplinary Journal, 20, 658-680.
Chung H, Anthony JC, and Schafer JL (2011). Latent class profile analysis: an application to stage sequential process of early onset drinking behaviours, Journal of the Royal Statistical Society Series A, 174, 689-712.
Chung H and Chang HC (2012). Bayesian approaches to the model selection problem in the analysis of latent stage-sequential process, Computational Statistics and Data Analysis, 56, 4097-4110.
Chung H, Loken E, and Schafer JL (2004). Difficulties in drawing inferences with finite-mixture models: a simple example with a simple solution, The American Statistician, 58, 152-158.
Dempster AP, Laird NM, and Rubin DB (1977). Maximum likelihood from incomplete data via the EM algorithm, Journal of the Royal Statistical Society Series B, 39, 1-38.
Gelfand AE and Smith AFM (1990). Sampling-based approaches to calculating marginal densities, Journal of the American Statistical Association, 85, 398-409.
Gelman A and Rubin DB (1992). Inference from iterative simulation using multiple sequences, Statistical Science, 7, 457-472.
Geweke J (1992). Evaluating the accuracy of sampling-based approaches to calculating posterior moments, Bayesian Statistics, 4, 169-193.
Goodman LA (1974). Exploratory latent structure analysis using both identifiable and unidentifiable models, Biometrika, 61, 215-231.
Lazarsfeld PF and Henry NW (1968). Latent Structure Analysis, Houghton Mifflin, Boston.
Roberts GO (1992). Convergence diagnostics of the Gibbs sampler, Bayesian Statistics, 4, 775-782.
Rubin DB and Schenker N (1987). Logit-based interval estimation for binomial data using the Jeffreys prior, Sociological Methodology, 17, 131-144.
Tanner MA and Wong WH (1987). The calculation of posterior distributions by data augmentation, Journal of the American Statistical Association, 82, 528-550.
Tierney L (1994). Markov chains for exploring posterior distributions (with discussion), Annals of Statistics, 22, 1701-1762.