

Shannon Entropy Estimation in ∞-Alphabets from Convergence Results: Studying Plug-In Estimators.

Authors

Silva Jorge F

Affiliation

Information and Decision System Group, Department of Electrical Engineering, Universidad de Chile, Av. Tupper 2007, Santiago 7591538, Chile.

Publication

Entropy (Basel). 2018 May 23;20(6):397. doi: 10.3390/e20060397.

DOI: 10.3390/e20060397
PMID: 33265487
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7512916/
Abstract

This work addresses the problem of Shannon entropy estimation in countably infinite alphabets studying and adopting some recent convergence results of the entropy functional, which is known to be a discontinuous function in the space of probabilities in ∞-alphabets. Sufficient conditions for the convergence of the entropy are used in conjunction with some deviation inequalities (including scenarios with both finitely and infinitely supported assumptions on the target distribution). From this perspective, four plug-in histogram-based estimators are studied showing that convergence results are instrumental to derive new strong consistent estimators for the entropy. The main application of this methodology is a new data-driven partition (plug-in) estimator. This scheme uses the data to restrict the support where the distribution is estimated by finding an optimal balance between estimation and approximation errors. The proposed scheme offers a consistent (distribution-free) estimator of the entropy in ∞-alphabets and optimal rates of convergence under certain regularity conditions on the problem (finite and unknown supported assumptions and tail bounded conditions on the target distribution).
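The plug-in approach described in the abstract can be illustrated with a minimal sketch: estimate the distribution from empirical frequencies, optionally restrict the support to sufficiently frequent symbols, and evaluate the entropy functional on the restricted estimate. The frequency-threshold rule below is a simplified stand-in for the paper's data-driven partition, not the author's exact construction.

```python
from collections import Counter
import math

def plugin_entropy(samples, tail_threshold=0.0):
    """Plug-in (histogram) estimate of Shannon entropy, in nats.

    If tail_threshold > 0, the support is restricted to symbols whose
    empirical frequency exceeds the threshold, and the distribution is
    renormalized on that restricted support. This is an illustrative
    support-restriction rule, not the paper's optimal partition scheme.
    """
    n = len(samples)
    counts = Counter(samples)
    # Keep only symbols above the empirical-frequency threshold.
    kept = {s: c for s, c in counts.items() if c / n > tail_threshold}
    total = sum(kept.values())
    # Evaluate the entropy functional on the (renormalized) estimate.
    return -sum((c / total) * math.log(c / total) for c in kept.values())
```

For a uniform sample over two symbols this returns log 2, and a large enough threshold collapses the support to the most frequent symbols, trading approximation error against estimation error as in the data-driven scheme.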


Similar articles

1. Shannon Entropy Estimation in ∞-Alphabets from Convergence Results: Studying Plug-In Estimators.
   Entropy (Basel). 2018 May 23;20(6):397. doi: 10.3390/e20060397.
2. Normal Laws for Two Entropy Estimators on Infinite Alphabets.
   Entropy (Basel). 2018 May 17;20(5):371. doi: 10.3390/e20050371.
3. Asymptotic Normality for Plug-In Estimators of Generalized Shannon's Entropy.
   Entropy (Basel). 2022 May 12;24(5):683. doi: 10.3390/e24050683.
4. Entropy estimation in Turing's perspective.
   Neural Comput. 2012 May;24(5):1368-89. doi: 10.1162/NECO_a_00266. Epub 2012 Feb 1.
5. Asymptotic distribution of sample Shannon entropy in the case of an underlying finite, regular Markov chain.
   Phys Rev E. 2021 Feb;103(2-1):022215. doi: 10.1103/PhysRevE.103.022215.
6. Minimax Estimation of Functionals of Discrete Distributions.
   IEEE Trans Inf Theory. 2015 May;61(5):2835-2885. doi: 10.1109/tit.2015.2412945. Epub 2015 Mar 13.
7. A Comparative Analysis of Discrete Entropy Estimators for Large-Alphabet Problems.
   Entropy (Basel). 2024 Apr 28;26(5):369. doi: 10.3390/e26050369.
8. Information estimators for weighted observations.
   Neural Netw. 2013 Oct;46:260-75. doi: 10.1016/j.neunet.2013.06.005. Epub 2013 Jun 24.
9. Kolmogorov Entropy for Convergence Rate in Incomplete Functional Time Series: Application to Percentile and Cumulative Estimation in High Dimensional Data.
   Entropy (Basel). 2023 Jul 24;25(7):1108. doi: 10.3390/e25071108.
10. Geometric Partition Entropy: Coarse-Graining a Continuous State Space.
    Entropy (Basel). 2022 Oct 8;24(10):1432. doi: 10.3390/e24101432.

Cited by

1. Several Basic Elements of Entropic Statistics.
   Entropy (Basel). 2023 Jul 13;25(7):1060. doi: 10.3390/e25071060.
2. Staircase patterns in words: subsequences, subwords, and separation number.
   Eur J Comb. 2020 May;86. Epub 2020 Mar 21.

References

1. Minimax Estimation of Functionals of Discrete Distributions.
   IEEE Trans Inf Theory. 2015 May;61(5):2835-2885. doi: 10.1109/tit.2015.2412945. Epub 2015 Mar 13.
2. Nonparametric entropy estimation using kernel densities.
   Methods Enzymol. 2009;467:531-546. doi: 10.1016/S0076-6879(09)67020-8.
3. Optimization of mutual information for multiresolution image registration.
   IEEE Trans Image Process. 2000;9(12):2083-99. doi: 10.1109/83.887976.
4. Information-theoretic analysis of interscale and intrascale dependencies between image wavelet coefficients.
   IEEE Trans Image Process. 2001;10(11):1647-58. doi: 10.1109/83.967393.
5. A nonparametric statistical method for image segmentation using information theory and curve evolution.
   IEEE Trans Image Process. 2005 Oct;14(10):1486-502. doi: 10.1109/tip.2005.854442.