This paper proposes a novel collapsed Gibbs sampling algorithm that marginalizes the model parameters of diagnostic classification models and directly samples the latent attribute mastery patterns. Because item parameters no longer need to be estimated, the method avoids the boundary problems that can arise in their estimation. A simulation study showed that the collapsed Gibbs sampling algorithm accurately recovers the true attribute mastery status under a variety of conditions. A second simulation showed that the collapsed Gibbs sampling algorithm is computationally more efficient than an alternative MCMC sampling algorithm implemented in JAGS. In an analysis of real data, the collapsed Gibbs sampling algorithm showed good classification agreement with the results of a previous study.
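To make the idea of marginalizing item parameters and sampling only attribute patterns concrete, the following is a minimal sketch of a collapsed Gibbs sampler for a DINA-type diagnostic classification model. It is not the authors' implementation: the conjugate Beta priors on slip and guess, the Dirichlet prior on pattern proportions, the toy Q-matrix, and all variable names are illustrative assumptions made for this example.

```python
# Minimal sketch (illustrative, not the paper's code) of a collapsed Gibbs sampler
# for the DINA model: slip/guess parameters and pattern mixing proportions are
# integrated out under Beta/Dirichlet priors, so only attribute patterns are sampled.
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: N examinees, J items, K attributes, binary Q-matrix (assumed values).
N, J, K = 100, 10, 3
Q = rng.integers(0, 2, size=(J, K))
Q[Q.sum(axis=1) == 0, 0] = 1                                  # each item needs >= 1 attribute
patterns = np.array(list(itertools.product([0, 1], repeat=K)))  # all 2^K mastery patterns
# eta[c, j] = 1 if pattern c masters every attribute required by item j
eta = (patterns @ Q.T == Q.sum(axis=1)).astype(int)

# Simulate responses from the DINA model for demonstration.
true_c = rng.integers(0, len(patterns), size=N)
slip, guess = 0.15, 0.20
X = rng.binomial(1, np.where(eta[true_c] == 1, 1 - slip, guess))  # N x J responses

# Assumed hyperparameters: Beta(a,b) on slip and guess, Dirichlet(delta) on patterns.
a_s, b_s, a_g, b_g, delta = 1.0, 1.0, 1.0, 1.0, 1.0

c = rng.integers(0, len(patterns), size=N)                    # initial pattern assignments

def sample_pattern(i, c):
    """Collapsed full conditional of examinee i's pattern given all other examinees."""
    mask = np.ones(N, bool)
    mask[i] = False
    eta_others, X_others = eta[c[mask]], X[mask]
    # Sufficient counts per item, split by ideal response (eta = 1 vs eta = 0).
    n1 = eta_others.sum(axis=0)
    k1 = (X_others * eta_others).sum(axis=0)                  # correct answers in eta = 1 group
    n0 = (N - 1) - n1
    k0 = (X_others * (1 - eta_others)).sum(axis=0)            # correct answers in eta = 0 group
    # Posterior-predictive P(correct) per item, with slip and guess integrated out.
    p1 = (b_s + k1) / (a_s + b_s + n1)
    p0 = (a_g + k0) / (a_g + b_g + n0)
    # Log-likelihood of examinee i's responses under every candidate pattern.
    pc = np.where(eta == 1, p1, p0)                           # 2^K x J
    loglik = (X[i] * np.log(pc) + (1 - X[i]) * np.log(1 - pc)).sum(axis=1)
    # Collapsed Dirichlet prior: counts of the other examinees in each pattern.
    logprior = np.log(np.bincount(c[mask], minlength=len(patterns)) + delta)
    w = np.exp(loglik + logprior - (loglik + logprior).max())
    return rng.choice(len(patterns), p=w / w.sum())

# Run the sampler and track posterior frequencies of each pattern per examinee.
burn, iters = 50, 150
post = np.zeros((N, len(patterns)))
for t in range(burn + iters):
    for i in range(N):
        c[i] = sample_pattern(i, c)
    if t >= burn:
        post[np.arange(N), c] += 1

map_c = post.argmax(axis=1)
print("pattern agreement with simulated truth:", np.mean(map_c == true_c))
```

Because the slip, guess, and mixing parameters never appear in the sampler, no point estimates of item parameters can drift to the boundary of the parameter space; only the discrete attribute patterns are updated, which is the property the abstract highlights.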
Supplementary Information: The online version contains supplementary material available at https://doi.org/10.1007/S0033312300005512a.