
Approximating Information Measures for Fields.

Authors

Dębowski Łukasz

Affiliation

Institute of Computer Science, Polish Academy of Sciences, ul. Jana Kazimierza 5, 01-248 Warszawa, Poland.

Publication

Entropy (Basel). 2020 Jan 9;22(1):79. doi: 10.3390/e22010079.

DOI: 10.3390/e22010079
PMID: 33285857
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7516512/
Abstract

We supply corrected proofs of the invariance of completion and the chain rule for the Shannon information measures of arbitrary fields, as stated by Dębowski in 2009. Our corrected proofs rest on a number of auxiliary approximation results for Shannon information measures, which may be of independent interest. As also discussed briefly in this article, the generalized calculus of Shannon information measures for fields, including the invariance of completion and the chain rule, is useful in particular for studying the ergodic decomposition of stationary processes and its links with statistical modeling of natural language.
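For finite-valued random variables, the chain rule that the paper generalizes to arbitrary fields reduces to the classical identity H(X,Y) = H(X) + H(Y|X). As a minimal sketch of that finite case (the joint probabilities below are illustrative, not from the paper):

```python
import math

# Illustrative joint distribution of two binary variables X and Y.
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

def entropy(dist):
    """Shannon entropy, in bits, of a probability dictionary."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Marginal distributions of X and Y.
p_x = {x: sum(p for (a, _), p in p_xy.items() if a == x) for x in (0, 1)}
p_y = {y: sum(p for (_, b), p in p_xy.items() if b == y) for y in (0, 1)}

h_xy = entropy(p_xy)
h_x = entropy(p_x)
# Conditional entropy H(Y|X), computed directly from the definition.
h_y_given_x = -sum(p * math.log2(p / p_x[x])
                   for (x, _), p in p_xy.items() if p > 0)

# Chain rule: H(X,Y) = H(X) + H(Y|X).
assert abs(h_xy - (h_x + h_y_given_x)) < 1e-12
```

The article's contribution is precisely that such identities, trivial to verify for finite alphabets, continue to hold for Shannon information measures of arbitrary fields once the required approximation results are in place.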

Similar articles

1. Approximating Information Measures for Fields.
Entropy (Basel). 2020 Jan 9;22(1):79. doi: 10.3390/e22010079.
2. Statistical Characteristics of Stationary Flow of Substance in a Network Channel Containing Arbitrary Number of Arms.
Entropy (Basel). 2020 May 15;22(5):553. doi: 10.3390/e22050553.
3. Low Complexity Estimation Method of Rényi Entropy for Ergodic Sources.
Entropy (Basel). 2018 Aug 31;20(9):657. doi: 10.3390/e20090657.
4. Approximations of Shannon Mutual Information for Discrete Variables with Applications to Neural Population Coding.
Entropy (Basel). 2019 Mar 4;21(3):243. doi: 10.3390/e21030243.
5. Ergodicity and Born's rule in an entangled two-qubit Bohmian system.
Phys Rev E. 2020 Oct;102(4-1):042205. doi: 10.1103/PhysRevE.102.042205.
6. Expected Shannon Entropy and Shannon Differentiation between Subpopulations for Neutral Genes under the Finite Island Model.
PLoS One. 2015 Jun 11;10(6):e0125471. doi: 10.1371/journal.pone.0125471. eCollection 2015.
7. Entropy of partially polarized light and application to statistical processing techniques.
J Opt Soc Am A Opt Image Sci Vis. 2004 Nov;21(11):2124-34. doi: 10.1364/josaa.21.002124.
8. Scale invariance of incident size distributions in response to sizes of their causes.
Risk Anal. 2002 Apr;22(2):369-81. doi: 10.1111/0272-4332.00016.
9. Inferring species richness and turnover by statistical multiresolution texture analysis of satellite imagery.
PLoS One. 2012;7(10):e46616. doi: 10.1371/journal.pone.0046616. Epub 2012 Oct 24.
10. An Axiomatic Characterization of Mutual Information.
Entropy (Basel). 2023 Apr 15;25(4):663. doi: 10.3390/e25040663.

Cited by

1. A Refutation of Finite-State Language Models through Zipf's Law for Factual Knowledge.
Entropy (Basel). 2021 Sep 1;23(9):1148. doi: 10.3390/e23091148.
2. Using the Semantic Information G Measure to Explain and Extend Rate-Distortion Functions and Maximum Entropy Distributions.
Entropy (Basel). 2021 Aug 15;23(8):1050. doi: 10.3390/e23081050.
3. Information Theory and Language.
Entropy (Basel). 2020 Apr 11;22(4):435. doi: 10.3390/e22040435.

References cited in this article

1. Estimating Predictive Rate-Distortion Curves via Neural Variational Inference.
Entropy (Basel). 2019 Jun 28;21(7):640. doi: 10.3390/e21070640.
2. Is Natural Language a Perigraphic Process? The Theorem about Facts and Words Revisited.
Entropy (Basel). 2018 Jan 26;20(2):85. doi: 10.3390/e20020085.
3. Signatures of infinity: Nonergodicity and resource scaling in prediction, complexity, and learning.
Phys Rev E Stat Nonlin Soft Matter Phys. 2015 May;91(5):050106. doi: 10.1103/PhysRevE.91.050106. Epub 2015 May 27.
4. Proof of the Ergodic Theorem.
Proc Natl Acad Sci U S A. 1931 Dec;17(12):656-60. doi: 10.1073/pnas.17.2.656.
5. Regularities unseen, randomness observed: levels of entropy convergence.
Chaos. 2003 Mar;13(1):25-54. doi: 10.1063/1.1530990.