
Asymptotic convergence rate of the EM algorithm for Gaussian mixtures.

Author information

Ma J, Xu L, Jordan MI

Affiliations

Department of Computer Science & Engineering, The Chinese University of Hong Kong, Shatin, Hong Kong, and Institute of Mathematics, Shantou University, Shantou, Guangdong, 515063, People's Republic of China.

Publication information

Neural Comput. 2000 Dec;12(12):2881-907. doi: 10.1162/089976600300014764.

Abstract

It is well known that the convergence rate of the expectation-maximization (EM) algorithm can be faster than those of conventional first-order iterative algorithms when the overlap in the given mixture is small. But this claim has not previously been proved mathematically. This article studies the problem asymptotically in the setting of Gaussian mixtures under the theoretical framework of Xu and Jordan (1996). It is proved that the asymptotic convergence rate of the EM algorithm for Gaussian mixtures, locally around the true solution $\Theta^*$, is $o\left(e^{0.5-\epsilon}(\Theta^*)\right)$, where $\epsilon > 0$ is an arbitrarily small number, $o(x)$ denotes a higher-order infinitesimal as $x \to 0$, and $e(\Theta^*)$ is a measure of the average overlap of the Gaussians in the mixture. In other words, the large-sample local convergence rate of the EM algorithm tends to be asymptotically superlinear as $e(\Theta^*)$ tends to zero.
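In the Xu-Jordan framework, the local convergence rate in question is, roughly, the spectral radius of the Jacobian of the EM update map at the fixed point, which governs the geometric decay of the error. The following is a minimal sketch, not the authors' code: the names `em_rate` and `d` are ours, the model is simplified to a two-component one-dimensional mixture with known weights and unit variances so that only the means are updated, and the final iterate is treated as a stand-in for the fixed point. It estimates the linear rate empirically from ratios of successive errors and illustrates how the rate shrinks as the separation grows (i.e., as the overlap shrinks).

```python
# Sketch: estimate the local linear convergence rate of EM for a
# two-component 1-D Gaussian mixture 0.5*N(-d,1) + 0.5*N(+d,1).
# Larger separation d means smaller overlap e(Theta*), so the
# empirical rate should shrink toward 0, consistent with the
# asymptotically superlinear behaviour described in the abstract.
import numpy as np

rng = np.random.default_rng(0)

def em_rate(d, n=200_000, iters=200):
    """Run EM (means only; weights/variances fixed at truth) and
    return an empirical estimate of the linear convergence rate."""
    # Sample from the true mixture.
    z = rng.random(n) < 0.5
    x = np.where(z, rng.normal(-d, 1.0, n), rng.normal(d, 1.0, n))

    mu = np.array([-1.2 * d, 0.8 * d])   # asymmetrically perturbed start
    traj = []
    for _ in range(iters):
        # E-step: posterior responsibility of component 0 for each point
        # (equal weights and unit variances cancel in the ratio).
        p0 = np.exp(-0.5 * (x - mu[0]) ** 2)
        p1 = np.exp(-0.5 * (x - mu[1]) ** 2)
        r0 = p0 / (p0 + p1)
        # M-step: responsibility-weighted means.
        mu = np.array([np.sum(r0 * x) / np.sum(r0),
                       np.sum((1 - r0) * x) / np.sum(1 - r0)])
        traj.append(mu.copy())

    # Treat the final iterate as the fixed point; the ratio of
    # successive errors approximates the spectral radius of the
    # Jacobian of the EM map there.
    mu_star = traj[-1]
    errs = [np.linalg.norm(m - mu_star) for m in traj[:-1]]
    ratios = [e2 / e1 for e1, e2 in zip(errs, errs[1:]) if e1 > 1e-12]
    return ratios[len(ratios) // 2]      # a mid-trajectory ratio

for d in [0.5, 1.0, 2.0, 3.0]:
    print(f"separation d = {d}: empirical EM rate ~ {em_rate(d):.4f}")
```

As the printed separation grows, the estimated rate should drop sharply, giving a numerical counterpart to the theoretical statement that the rate is $o\left(e^{0.5-\epsilon}(\Theta^*)\right)$.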
