Approximating Gaussian mixture model or radial basis function network with multilayer perceptron.

Publication Information

IEEE Trans Neural Netw Learn Syst. 2013 Jul;24(7):1161-6. doi: 10.1109/TNNLS.2013.2249086.

Abstract

Gaussian mixture models (GMMs) and multilayer perceptron (MLP) are both popular pattern classification techniques. This brief shows that a multilayer perceptron with quadratic inputs (MLPQ) can accurately approximate GMMs with diagonal covariance matrices. The mapping equations between the parameters of GMM and the weights of MLPQ are presented. A similar approach is applied to radial basis function networks (RBFNs) to show that RBFNs with Gaussian basis functions and Euclidean norm can be approximated accurately with MLPQ. The mapping equations between RBFN and MLPQ weights are presented. There are well-established training procedures for GMMs, such as the expectation maximization (EM) algorithm. The GMM parameters obtained by the EM algorithm can be used to generate a set of initial weights of MLPQ. Similarly, a trained RBFN can be used to generate a set of initial weights of MLPQ. MLPQ training can be continued further with gradient-descent based methods, which can lead to improvement in performance compared to the GMM or RBFN from which it is initialized. Thus, the MLPQ can always perform as well as or better than the GMM or RBFN.
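
The identity underlying the GMM-to-MLPQ mapping is that the log-density of a diagonal-covariance Gaussian is an affine function of the inputs and their squares. A minimal sketch of this fact follows; the notation (the bias term \(b_k\) and the weight symbols) is ours for illustration and the paper's exact mapping equations may be stated differently. For component \(k\) with prior \(\pi_k\), mean \(\mu_k\), and variances \(\sigma_{k1}^{2},\dots,\sigma_{kD}^{2}\),

\[
\log\!\big(\pi_k\,\mathcal{N}(x;\mu_k,\Sigma_k)\big)
= \sum_{d=1}^{D}\Big(-\tfrac{1}{2\sigma_{kd}^{2}}\Big)\,x_d^{2}
+ \sum_{d=1}^{D}\frac{\mu_{kd}}{\sigma_{kd}^{2}}\,x_d
+ b_k,
\]
\[
b_k = \log\pi_k
- \tfrac{1}{2}\sum_{d=1}^{D}\Big(\frac{\mu_{kd}^{2}}{\sigma_{kd}^{2}}
+ \log\big(2\pi\sigma_{kd}^{2}\big)\Big).
\]

A hidden unit that receives both \(x_d\) and \(x_d^{2}\) as inputs can therefore reproduce each component's log-weighted density exactly, with weights \(-1/(2\sigma_{kd}^{2})\) on the squared inputs, \(\mu_{kd}/\sigma_{kd}^{2}\) on the linear inputs, and bias \(b_k\). This is the sense in which EM-estimated GMM parameters translate directly into initial MLPQ weights, which gradient-descent training can then refine.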
