Weissman, Alexander
Psychometric Research, Law School Admission Council, 662 Penn Street, Box 40, Newtown, PA 18940, USA.
Psychometrika. 2013 Jan;78(1):134-53. doi: 10.1007/s11336-012-9295-z. Epub 2012 Oct 23.
Convergence of the expectation-maximization (EM) algorithm to a global optimum of the marginal log likelihood function for unconstrained latent variable models with categorical indicators is established. Sufficient conditions under which global convergence of the EM algorithm is attainable are derived in an information-theoretic context by interpreting the EM algorithm as alternating minimization of the Kullback-Leibler divergence between two convex sets of distributions. It is shown that these conditions are satisfied by an unconstrained latent class model, yielding an optimal bound against which more highly constrained models may be compared.
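The alternating-minimization reading of EM referenced above can be sketched as follows. This is a minimal illustration in the spirit of Csiszár and Tusnády's alternating minimization framework; the notation ($\tilde p$, $r$, $\mathcal{P}$, $\mathcal{Q}$) is ours rather than the paper's. For observed categorical indicators $x$ with empirical distribution $\tilde p(x)$ and latent variable $z$, take

% Sketch: EM as alternating minimization of a Kullback-Leibler divergence
% between two sets of joint distributions over (x, z).
% Notation is illustrative, not taken from the paper.
\[
\mathcal{P} = \Bigl\{\, r(x,z) : \textstyle\sum_{z} r(x,z) = \tilde p(x) \,\Bigr\},
\qquad
\mathcal{Q} = \bigl\{\, p_{\theta}(x,z) : \theta \in \Theta \,\bigr\},
\]
\begin{align*}
\text{E-step:}\quad r^{(t+1)} &= \arg\min_{r \in \mathcal{P}} D\bigl(r \,\big\|\, p_{\theta^{(t)}}\bigr)
  = \tilde p(x)\, p_{\theta^{(t)}}(z \mid x),\\
\text{M-step:}\quad \theta^{(t+1)} &= \arg\min_{\theta \in \Theta} D\bigl(r^{(t+1)} \,\big\|\, p_{\theta}\bigr).
\end{align*}

Because $\min_{r \in \mathcal{P}} D(r \,\|\, p_{\theta}) = -\mathrm{E}_{\tilde p}[\log p_{\theta}(x)] - H(\tilde p)$, each sweep decreases the negative marginal log likelihood, and when both $\mathcal{P}$ and $\mathcal{Q}$ are convex sets of distributions (as the abstract indicates holds for the unconstrained latent class model), the alternating scheme converges to the global minimum of the divergence, hence to a global optimum of the marginal log likelihood.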