
Average Entropy of Gaussian Mixtures.

Authors

Joudeh Basheer, Škorić Boris

Affiliation

Department of Computer Science and Mathematics, Eindhoven University of Technology, 5612 AZ Eindhoven, The Netherlands.

Publication

Entropy (Basel). 2024 Aug 1;26(8):659. doi: 10.3390/e26080659.

Abstract

We calculate the average differential entropy of a q-component Gaussian mixture in R^n. For simplicity, all components have covariance matrix σ²I, while the means {W_i}_{i=1}^q are i.i.d. Gaussian vectors with zero mean and covariance s²I. We obtain a series expansion in μ = s²/σ² for the average differential entropy up to order O(μ²), and we provide a recipe for calculating higher-order terms. Our result provides an analytic approximation with a quantifiable order of magnitude for the error, which previous literature does not achieve.
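The setting in the abstract (equal-weight components with covariance σ²I and means drawn i.i.d. from N(0, s²I)) is easy to probe numerically. The sketch below is a hypothetical helper, not the paper's method: it estimates the differential entropy h(X) = -E[log p(X)] by plain Monte Carlo for one draw of the means, rather than via the series expansion in μ. The function name and parameter defaults are illustrative.

```python
import math
import random

def mixture_entropy_mc(q=4, n=3, sigma=1.0, s=0.3, n_samples=50_000, seed=0):
    """Monte Carlo estimate of the differential entropy of an equal-weight
    q-component Gaussian mixture in R^n, each component with covariance
    sigma^2*I and means drawn i.i.d. from N(0, s^2*I)."""
    rng = random.Random(seed)
    # one realisation of the random means W_i ~ N(0, s^2*I)
    means = [[rng.gauss(0.0, s) for _ in range(n)] for _ in range(q)]
    # constant part of log p(x): the 1/q weight and the Gaussian normalisation
    log_norm = -math.log(q) - 0.5 * n * math.log(2.0 * math.pi * sigma * sigma)
    total = 0.0
    for _ in range(n_samples):
        # sample x from the mixture: pick a component, add isotropic noise
        w = means[rng.randrange(q)]
        x = [wi + rng.gauss(0.0, sigma) for wi in w]
        # log p(x) via a numerically stable log-sum-exp over the q components
        es = [-sum((xi - mi) ** 2 for xi, mi in zip(x, m)) / (2.0 * sigma * sigma)
              for m in means]
        top = max(es)
        total += log_norm + top + math.log(sum(math.exp(e - top) for e in es))
    return -total / n_samples  # estimate of h(X) = -E[log p(X)]
```

For s = 0 the mixture collapses to a single Gaussian, so the estimate should approach the baseline (n/2)·log(2πeσ²); the paper's expansion quantifies how the average entropy grows from that baseline in powers of μ = s²/σ².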


Similar articles

Learning Gaussian mixture models with entropy-based criteria.
IEEE Trans Neural Netw. 2009 Nov;20(11):1756-71. doi: 10.1109/TNN.2009.2030190. Epub 2009 Sep 18.

Effects of correlated variability on information entropies in nonextensive systems.
Phys Rev E Stat Nonlin Soft Matter Phys. 2008 Aug;78(2 Pt 1):021141. doi: 10.1103/PhysRevE.78.021141. Epub 2008 Aug 28.
