The Duality of Entropy/Extropy, and Completion of the Kullback Information Complex.

Author Information

Frank Lad, Giuseppe Sanfilippo, Gianna Agrò

Affiliations

Department of Mathematics and Statistics, University of Canterbury, 8140 Christchurch, New Zealand.

Department of Mathematics and Computer Science, University of Palermo, 90123 Palermo, Italy.

Publication Information

Entropy (Basel). 2018 Aug 9;20(8):593. doi: 10.3390/e20080593.

Abstract

The refinement axiom for entropy has been provocative in providing foundations of information theory, recognised as thoughtworthy in the writings of both Shannon and Jaynes. A resolution to their concerns has been provided recently by the discovery that the entropy measure of a probability distribution has a dual measure, a complementary companion designated as "extropy". We report here the main results that identify this fact, specifying the dual equations and exhibiting some of their structure. The duality extends beyond a simple assessment of entropy, to the formulation of relative entropy and the Kullback symmetric distance between two forecasting distributions. This is defined by the sum of a pair of directed divergences. Examining the defining equation, we notice that this symmetric measure can be generated by two other explicable pairs of functions as well, neither of which is a Bregman divergence. The Kullback information complex is constituted by the symmetric measure of entropy/extropy along with one of each of these three function pairs. It is intimately related to the total logarithmic score of two distinct forecasting distributions for a quantity under consideration, this being a complete proper score. The information complex is isomorphic to the expectations that the two forecasting distributions assess for their achieved scores, each for its own score and for the score achieved by the other. Analysis of the scoring problem exposes a Pareto optimal exchange of the forecasters' scores that both are willing to engage. Both would support its evaluation for assessing the relative quality of the information they provide regarding the observation of an unknown quantity of interest. We present our results without proofs, as these appear in source articles that are referenced. The focus here is on their content, unhindered. The mathematical syntax of probability we employ relies upon the operational subjective constructions of Bruno de Finetti.
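
For orientation, the central definitions can be restated compactly. This is a sketch assembled from the source articles the abstract cites; the notation p and q for the two forecasting distributions, and K for the symmetric distance (to avoid clashing with J for extropy), are our own shorthand. For a probability mass function $p = (p_1, \dots, p_n)$, entropy and its dual extropy are

$$H(p) = -\sum_{i=1}^{n} p_i \log p_i, \qquad J(p) = -\sum_{i=1}^{n} (1 - p_i) \log (1 - p_i),$$

and together they satisfy the duality identity

$$H(p) + J(p) = \sum_{i=1}^{n} H(p_i, 1 - p_i),$$

the right-hand side being the summed entropies of the binary distributions $(p_i, 1 - p_i)$ that each component event defines against its complement. The Kullback symmetric distance referred to in the abstract is the sum of the two directed divergences between the forecasting distributions $p$ and $q$:

$$K(p, q) = D(p \,\|\, q) + D(q \,\|\, p), \qquad \text{where } D(p \,\|\, q) = \sum_{i=1}^{n} p_i \log \frac{p_i}{q_i}.$$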

Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd08/7513105/647a151b50a8/entropy-20-00593-g001.jpg
