

On a Variational Definition for the Jensen-Shannon Symmetrization of Distances Based on the Information Radius.

Author information

Nielsen Frank

Affiliation

Sony Computer Science Laboratories, Tokyo 141-0022, Japan.

Publication information

Entropy (Basel). 2021 Apr 14;23(4):464. doi: 10.3390/e23040464.

Abstract

We generalize the Jensen-Shannon divergence and the Jensen-Shannon diversity index by considering a variational definition with respect to a generic mean, thereby extending the notion of Sibson's information radius. The variational definition applies to any arbitrary distance and yields a new way to define a Jensen-Shannon symmetrization of distances. When the variational optimization is further constrained to belong to prescribed families of probability measures, we get relative Jensen-Shannon divergences and their equivalent Jensen-Shannon symmetrizations of distances that generalize the concept of information projections. Finally, we touch upon applications of these variational Jensen-Shannon divergences and diversity indices to clustering and quantization tasks of probability measures, including statistical mixtures.
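As a reading aid, the variational definition the abstract refers to builds on the classical identity for the Jensen-Shannon divergence, stated here as a sketch in standard notation (the symbols P, Q, C and the Kullback-Leibler divergence KL are the usual ones, not the paper's):

$$
\mathrm{JS}(P,Q) \;=\; \min_{C}\ \tfrac{1}{2}\Big(\mathrm{KL}(P:C) + \mathrm{KL}(Q:C)\Big),
$$

with the minimum attained at the arithmetic mixture $C^{*} = \tfrac{1}{2}(P+Q)$; this is Sibson's information radius for two measures with uniform weights. Replacing KL by an arbitrary distance D, as the abstract describes, suggests a variational Jensen-Shannon symmetrization of the form

$$
\mathrm{JS}_{D}(P,Q) \;=\; \min_{C}\ \tfrac{1}{2}\Big(D(P:C) + D(Q:C)\Big),
$$

written here with the plain arithmetic average for simplicity (the paper considers generic means), and constraining C to a prescribed family of probability measures gives the relative Jensen-Shannon divergences mentioned in the abstract.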


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3f83/8071043/873a922556c9/entropy-23-00464-g001.jpg
