In this paper, we propose a novel and highly effective variational Bayesian expectation maximization-maximization (VBEM-M) inference method for the log-linear cognitive diagnostic model (LCDM). In the implementation of the variational Bayesian (VB) approach for the saturated LCDM, the conditional variational posteriors of the parameters that need to be derived are not in the same distributional family as the priors; the VBEM-M algorithm overcomes this problem. Our algorithm directly estimates the item parameters and the latent attribute-mastery patterns simultaneously, whereas Yamaguchi and Okada’s (2020a) variational Bayesian algorithm requires an additional transformation step to obtain the item parameters of the LCDM. We conducted multiple simulation studies to assess the performance of the VBEM-M algorithm in terms of parameter recovery, execution time, and convergence rate. Furthermore, we conducted a series of comparative studies on the accuracy of parameter estimation for the DINA model and the saturated LCDM, comparing the VBEM-M, VB, expectation-maximization, and Markov chain Monte Carlo algorithms. The results indicated that our method yields more stable and accurate estimates, especially for small sample sizes. Finally, we demonstrated the utility of the proposed algorithm using two real datasets.