Chu Jun-Uk, Lee Yun-Jung
School of Electrical Engineering and Computer Science, Kyungpook National University, Daegu 702-701, Korea.
IEEE Trans Neural Syst Rehabil Eng. 2009 Jun;17(3):287-97. doi: 10.1109/TNSRE.2009.2015177. Epub 2009 Feb 18.
This paper presents a new learning method for Gaussian mixture models (GMMs) to improve their generalization ability. A traditional maximum a posteriori (MAP) parameter estimate is used to achieve regularization based on conjugate priors. In addition, a model order selection criterion is derived from a Bayesian Laplace approach, using the conjugate priors to measure the uncertainty of the estimated parameters. As a result, the proposed learning method avoids convergence toward the boundary of the parameter space and selects the optimal order for a GMM with greater stability than conventional methods based on a flat prior. When the proposed learning method is applied to construct a GMM classifier for electromyogram (EMG) pattern recognition, the resulting classifier achieves high generalization ability and outperforms conventional classifiers in recognition accuracy.
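The regularization idea in the abstract can be illustrated with a minimal MAP-EM sketch for a one-dimensional GMM. This is not the authors' implementation: the specific conjugate priors here (a Dirichlet prior on the mixing weights and an inverse-gamma-style prior on each variance, with hyperparameters `alpha`, `nu`, `S0`) and the quantile-based initialization are illustrative assumptions. The priors keep the weight and variance estimates away from the boundary of the parameter space (zero weight, zero variance), which is the failure mode the paper's method is designed to avoid.

```python
import numpy as np

def map_em_gmm(x, K=2, iters=50, alpha=2.0, nu=2.0, S0=1.0):
    """MAP EM for a 1-D GMM (illustrative sketch, not the paper's method).

    Priors (assumed for illustration):
      - Dirichlet(alpha) on the mixing weights,
      - an inverse-gamma-style prior (nu, S0) on each component variance.
    Both pull the estimates off the boundary of the parameter space.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    w = np.full(K, 1.0 / K)
    # Deterministic init: spread initial means over the data quantiles.
    mu = np.quantile(x, (np.arange(K) + 0.5) / K)
    var = np.full(K, np.var(x))
    for _ in range(iters):
        # E-step: responsibilities r[i, k] ∝ w_k N(x_i | mu_k, var_k)
        logp = (-0.5 * (x[:, None] - mu) ** 2 / var
                - 0.5 * np.log(2 * np.pi * var) + np.log(w))
        logp -= logp.max(axis=1, keepdims=True)  # stabilize before exp
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        Nk = r.sum(axis=0)
        # M-step with MAP corrections from the conjugate priors:
        w = (Nk + alpha - 1.0) / (n + K * (alpha - 1.0))  # Dirichlet mode
        mu = (r * x[:, None]).sum(axis=0) / Nk
        sq = (r * (x[:, None] - mu) ** 2).sum(axis=0)
        var = (sq + S0) / (Nk + nu)  # prior keeps variances bounded away from 0
    return w, mu, var
```

With `alpha > 1` the weight update can never reach zero, and the `S0` term in the variance update prevents the degenerate solution in which a component collapses onto a single data point, the boundary behavior that plain maximum-likelihood EM is prone to.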