IEEE Trans Neural Netw Learn Syst. 2013 Jul;24(7):1161-6. doi: 10.1109/TNNLS.2013.2249086.
Gaussian mixture models (GMMs) and multilayer perceptrons (MLPs) are both popular pattern classification techniques. This brief shows that a multilayer perceptron with quadratic inputs (MLPQ) can accurately approximate GMMs with diagonal covariance matrices. The mapping equations between the parameters of a GMM and the weights of an MLPQ are presented. A similar approach is applied to radial basis function networks (RBFNs) to show that RBFNs with Gaussian basis functions and the Euclidean norm can be approximated accurately by an MLPQ. The mapping equations between RBFN and MLPQ weights are presented. There are well-established training procedures for GMMs, such as the expectation-maximization (EM) algorithm. The GMM parameters obtained by the EM algorithm can be used to generate a set of initial weights for an MLPQ. Similarly, a trained RBFN can be used to generate a set of initial MLPQ weights. MLPQ training can then be continued with gradient-descent-based methods, which can improve performance over the GMM or RBFN used for initialization. Thus, the MLPQ can always perform as well as or better than the GMM or RBFN.
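The key observation behind the GMM-to-MLPQ mapping is that the log of a weighted Gaussian with diagonal covariance is a linear function of the augmented input [x, x²], so each mixture component corresponds to one hidden unit whose weights are read off the component's parameters. The sketch below illustrates this idea; it is a minimal reconstruction from the abstract, not the paper's exact formulation, and the function names are hypothetical.

```python
import numpy as np

def gmm_to_mlpq_weights(mix_weights, means, variances):
    """Map diagonal-covariance GMM parameters (shapes (K,), (K, D), (K, D))
    to first-layer weights of a network with quadratic inputs [x, x^2].

    For component k:
      log(w_k * N(x; mu_k, diag(var_k)))
        = sum_i [ -x_i^2 / (2 var_ki) + mu_ki x_i / var_ki ] + bias_k,
    which is linear in x and x^2.
    """
    W_quad = -0.5 / variances            # weights on the x^2 inputs
    W_lin = means / variances            # weights on the x inputs
    bias = (np.log(mix_weights)
            - 0.5 * np.sum(means**2 / variances, axis=1)
            - 0.5 * np.sum(np.log(2.0 * np.pi * variances), axis=1))
    return W_lin, W_quad, bias

def gmm_log_density_via_mlpq(x, W_lin, W_quad, bias):
    """Evaluate the GMM log-density through the quadratic-input layer.

    Hidden pre-activations are linear in [x, x^2]; summing exp(z_k)
    recovers the mixture density (log-sum-exp used for stability).
    """
    z = x @ W_lin.T + (x**2) @ W_quad.T + bias        # (N, K)
    z_max = z.max(axis=-1, keepdims=True)
    return (z_max + np.log(np.exp(z - z_max).sum(axis=-1, keepdims=True))).squeeze(-1)
```

Because the mapping is exact for diagonal covariances, the resulting weights reproduce the GMM log-likelihood to machine precision; they can then serve as the initialization that gradient-descent training refines, as the abstract describes.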