


Fast Approximations of the Jeffreys Divergence between Univariate Gaussian Mixtures via Mixture Conversions to Exponential-Polynomial Distributions.

Author information

Nielsen Frank

Affiliation

Sony Computer Science Laboratories, Tokyo 141-0022, Japan.

Publication information

Entropy (Basel). 2021 Oct 28;23(11):1417. doi: 10.3390/e23111417.

DOI: 10.3390/e23111417
PMID: 34828115
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC8619509/
Abstract

The Jeffreys divergence is a renowned arithmetic symmetrization of the oriented Kullback-Leibler divergence broadly used in information sciences. Since the Jeffreys divergence between Gaussian mixture models is not available in closed form, various techniques with advantages and disadvantages have been proposed in the literature to either estimate, approximate, or lower and upper bound this divergence. In this paper, we propose a simple yet fast heuristic to approximate the Jeffreys divergence between two univariate Gaussian mixtures with an arbitrary number of components. Our heuristic relies on converting the mixtures into pairs of dually parameterized probability densities belonging to an exponential-polynomial family. To measure with a closed-form formula the goodness of fit between a Gaussian mixture and an exponential-polynomial density approximating it, we generalize the Hyvärinen divergence to α-Hyvärinen divergences. In particular, the 2-Hyvärinen divergence allows us to perform model selection by choosing the order of the exponential-polynomial densities used to approximate the mixtures. We experimentally demonstrate that our heuristic to approximate the Jeffreys divergence between mixtures improves over the computational time of stochastic Monte Carlo estimations by several orders of magnitude while approximating the Jeffreys divergence reasonably well, especially when the mixtures have a very small number of modes.
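The Monte Carlo baseline the abstract compares against is straightforward to state: J(p, q) = KL(p‖q) + KL(q‖p), with each KL term estimated by averaging log-density ratios over samples drawn from the corresponding mixture. Below is a minimal NumPy sketch of that baseline estimator (not the paper's exponential-polynomial heuristic); the function and variable names are illustrative, not from the paper.

```python
import numpy as np

def gmm_pdf(x, weights, means, stds):
    """Density of a univariate Gaussian mixture evaluated at points x."""
    x = np.asarray(x, dtype=float)[:, None]  # shape (n, 1) for broadcasting
    comp = np.exp(-0.5 * ((x - means) / stds) ** 2) / (stds * np.sqrt(2.0 * np.pi))
    return comp @ weights  # weighted sum over components, shape (n,)

def gmm_sample(n, weights, means, stds, rng):
    """Draw n samples: pick a component per sample, then draw from that Gaussian."""
    idx = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(means[idx], stds[idx])

def jeffreys_mc(p, q, n=100_000, seed=0):
    """Monte Carlo estimate of J(p, q) = KL(p||q) + KL(q||p) for two mixtures."""
    rng = np.random.default_rng(seed)
    xp = gmm_sample(n, *p, rng)  # samples from p for KL(p||q)
    xq = gmm_sample(n, *q, rng)  # samples from q for KL(q||p)
    kl_pq = np.mean(np.log(gmm_pdf(xp, *p) / gmm_pdf(xp, *q)))
    kl_qp = np.mean(np.log(gmm_pdf(xq, *q) / gmm_pdf(xq, *p)))
    return kl_pq + kl_qp

# A bimodal two-component mixture p versus a standard Gaussian q.
p = (np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([0.5, 0.5]))
q = (np.array([1.0]), np.array([0.0]), np.array([1.0]))
print(jeffreys_mc(p, q))  # strictly positive; jeffreys_mc(p, p) is 0
```

The estimator is unbiased but needs many samples per evaluation, which is the cost the paper's closed-form heuristic avoids.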


Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/10bf/8619509/4be60a5682d0/entropy-23-01417-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/10bf/8619509/0ee4b0eae391/entropy-23-01417-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/10bf/8619509/391e451e08e0/entropy-23-01417-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/10bf/8619509/23399b2ed757/entropy-23-01417-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/10bf/8619509/93f8b2aaf0cd/entropy-23-01417-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/10bf/8619509/640566051a9b/entropy-23-01417-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/10bf/8619509/7d1614ff00ae/entropy-23-01417-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/10bf/8619509/0ae4696b2ee2/entropy-23-01417-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/10bf/8619509/7dc7e67d44cc/entropy-23-01417-g009.jpg

Similar articles

1
Fast Approximations of the Jeffreys Divergence between Univariate Gaussian Mixtures via Mixture Conversions to Exponential-Polynomial Distributions.
Entropy (Basel). 2021 Oct 28;23(11):1417. doi: 10.3390/e23111417.
2
On the Jensen-Shannon Symmetrization of Distances Relying on Abstract Means.
Entropy (Basel). 2019 May 11;21(5):485. doi: 10.3390/e21050485.
3
Revisiting Chernoff Information with Likelihood Ratio Exponential Families.
Entropy (Basel). 2022 Oct 1;24(10):1400. doi: 10.3390/e24101400.
4
Statistical Divergences between Densities of Truncated Exponential Families with Nested Supports: Duo Bregman and Duo Jensen Divergences.
Entropy (Basel). 2022 Mar 17;24(3):421. doi: 10.3390/e24030421.
5
On a Generalization of the Jensen-Shannon Divergence and the Jensen-Shannon Centroid.
Entropy (Basel). 2020 Feb 16;22(2):221. doi: 10.3390/e22020221.
6
Discrete Versions of Jensen-Fisher, Fisher and Bayes-Fisher Information Measures of Finite Mixture Distributions.
Entropy (Basel). 2021 Mar 18;23(3):363. doi: 10.3390/e23030363.
7
Integration of stochastic models by minimizing alpha-divergence.
Neural Comput. 2007 Oct;19(10):2780-96. doi: 10.1162/neco.2007.19.10.2780.
8
Jeffreys Divergence and Generalized Fisher Information Measures on Fokker-Planck Space-Time Random Field.
Entropy (Basel). 2023 Oct 13;25(10):1445. doi: 10.3390/e25101445.
9
Inequalities for Jensen-Sharma-Mittal and Jeffreys-Sharma-Mittal Type f-Divergences.
Entropy (Basel). 2021 Dec 16;23(12):1688. doi: 10.3390/e23121688.
10
Unlocking the potential of LSTM for accurate salary prediction with MLE, Jeffreys prior, and advanced risk functions.
PeerJ Comput Sci. 2024 Feb 22;10:e1875. doi: 10.7717/peerj-cs.1875. eCollection 2024.

Cited by

1
Central Role of Hypertension in HIV Comorbidity Networks: A Population-Based Study of Age and Sex-Specific Patterns in Southwest China.
J Am Heart Assoc. 2025 May 20;14(10):e040634. doi: 10.1161/JAHA.124.040634. Epub 2025 May 13.
2
Relative Entropy Application to Study the Elastoplastic Behavior of S235JR Structural Steel.
Materials (Basel). 2024 Feb 3;17(3):727. doi: 10.3390/ma17030727.
3
Probabilistic Relative Entropy in Homogenization of Fibrous Metal Matrix Composites (MMCs).
Materials (Basel). 2023 Sep 7;16(18):6112. doi: 10.3390/ma16186112.
4
Distance in Information and Statistical Physics III.
Entropy (Basel). 2023 Jan 5;25(1):110. doi: 10.3390/e25010110.

References

1
Shape retrieval using hierarchical total Bregman soft clustering.
IEEE Trans Pattern Anal Mach Intell. 2012 Dec;34(12):2407-19. doi: 10.1109/TPAMI.2012.44.
2
Rayleigh mixture model for plaque characterization in intravascular ultrasound.
IEEE Trans Biomed Eng. 2011 May;58(5):1314-24. doi: 10.1109/TBME.2011.2106498. Epub 2011 Jan 17.
3
An invariant form for the prior probability in estimation problems.
Proc R Soc Lond A Math Phys Sci. 1946;186(1007):453-61. doi: 10.1098/rspa.1946.0056.