
Neural Estimator of Information for Time-Series Data with Dependency.

Author Information

Molavipour Sina, Ghourchian Hamid, Bassi Germán, Skoglund Mikael

Affiliations

School of Electrical Engineering and Computer Science (EECS), KTH Royal Institute of Technology, 100 44 Stockholm, Sweden.

Ericsson Research, 164 83 Stockholm, Sweden.

Publication Information

Entropy (Basel). 2021 May 21;23(6):641. doi: 10.3390/e23060641.

DOI: 10.3390/e23060641
PMID: 34064014
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC8224080/
Abstract

Novel approaches to estimating information measures using neural networks have attracted considerable attention in recent years in both the information theory and machine learning communities. These neural-based estimators have been shown to converge to the true values when estimating mutual information and conditional mutual information using independent samples. However, if the samples in the dataset are not independent, the consistency of these estimators requires further investigation. This is of particular interest for a more complex measure such as the directed information, which is pivotal in characterizing causality and is meaningful over time-dependent variables. The extension of the convergence proof to such cases is not trivial and demands further assumptions on the data. In this paper, we show that our neural estimator for conditional mutual information is consistent when the dataset is generated from a stationary and ergodic source. In other words, we show that our neural-network-based information estimator converges asymptotically to the true value with probability one. Besides the universal function approximation property of neural networks, a core lemma in the convergence proof is Birkhoff's ergodic theorem. Additionally, we use the technique to estimate directed information and demonstrate the effectiveness of our approach in simulations.
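Neural estimators of this family (e.g. MINE-style estimators) train a critic network to maximize the Donsker-Varadhan lower bound I(X;Y) >= E_p[T(x,y)] - log E_{p_x p_y}[exp(T(x,y))]. As a minimal illustration (not the authors' implementation; the variable names and setup are ours), the sketch below evaluates this bound for jointly Gaussian variables, where the optimal critic is the closed-form log density ratio, so the bound can be checked against the analytic mutual information -0.5*log(1 - rho^2) without training a network.

```python
import numpy as np

# Jointly Gaussian (X, Y) with correlation rho; the true mutual information
# is I(X;Y) = -0.5 * log(1 - rho^2), which the bound should recover.
rng = np.random.default_rng(0)
rho = 0.8
n = 200_000

x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)

def critic(x, y, rho):
    """Optimal Donsker-Varadhan critic for standard bivariate Gaussians:
    the log density ratio log p(x, y) / (p(x) p(y))."""
    return (-0.5 * np.log(1 - rho**2)
            - (rho**2 * (x**2 + y**2) - 2 * rho * x * y) / (2 * (1 - rho**2)))

# DV lower bound: E_p[T(x, y)] - log E_{p_x p_y}[exp(T(x, y))].
# Shuffling y breaks the dependence, yielding product-of-marginals samples.
joint_term = critic(x, y, rho).mean()
marginal_term = np.log(np.exp(critic(x, rng.permutation(y), rho)).mean())
dv_estimate = joint_term - marginal_term

true_mi = -0.5 * np.log(1 - rho**2)
print(f"DV estimate: {dv_estimate:.4f}, true MI: {true_mi:.4f}")
```

In the paper's setting the critic is a neural network and the samples come from a dependent (stationary, ergodic) process rather than being i.i.d., which is exactly why the consistency argument needs Birkhoff's ergodic theorem instead of the law of large numbers.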


Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b85b/8224080/1e141ec032c4/entropy-23-00641-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b85b/8224080/2843285e933b/entropy-23-00641-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b85b/8224080/f51a7a669160/entropy-23-00641-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b85b/8224080/cf0f185a8151/entropy-23-00641-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b85b/8224080/a1a04da127e3/entropy-23-00641-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b85b/8224080/7bd1a62dc63b/entropy-23-00641-g006.jpg

Similar Articles

1
Neural Estimator of Information for Time-Series Data with Dependency.
Entropy (Basel). 2021 May 21;23(6):641. doi: 10.3390/e23060641.
2
Minimax Estimation of Functionals of Discrete Distributions.
IEEE Trans Inf Theory. 2015 May;61(5):2835-2885. doi: 10.1109/tit.2015.2412945. Epub 2015 Mar 13.
3
Pointwise convergence of Birkhoff averages for global observables.
Chaos. 2018 Aug;28(8):083111. doi: 10.1063/1.5036652.
4
Geometric Estimation of Multivariate Dependency.
Entropy (Basel). 2019 Aug 12;21(8):787. doi: 10.3390/e21080787.
5
Statistical foundation of Variational Bayes neural networks.
Neural Netw. 2021 May;137:151-173. doi: 10.1016/j.neunet.2021.01.027. Epub 2021 Feb 5.
6
Birkhoff's theorem, many-body response functions, and the ergodic condition.
Phys Rev Lett. 2007 Mar 16;98(11):110403. doi: 10.1103/PhysRevLett.98.110403. Epub 2007 Mar 12.
7
Generalizations of Fano's Inequality for Conditional Information Measures via Majorization Theory.
Entropy (Basel). 2020 Mar 1;22(3):288. doi: 10.3390/e22030288.
8
Estimating Conditional Transfer Entropy in Time Series Using Mutual Information and Nonlinear Prediction.
Entropy (Basel). 2020 Oct 3;22(10):1124. doi: 10.3390/e22101124.
9
Estimating the Mutual Information between Two Discrete, Asymmetric Variables with Limited Samples.
Entropy (Basel). 2019 Jun 25;21(6):623. doi: 10.3390/e21060623.
10
On the Difference between the Information Bottleneck and the Deep Information Bottleneck.
Entropy (Basel). 2020 Jan 22;22(2):131. doi: 10.3390/e22020131.

Cited By

1
A Method for Estimating the Entropy of Time Series Using Artificial Neural Networks.
Entropy (Basel). 2021 Oct 29;23(11):1432. doi: 10.3390/e23111432.

References

1
Estimating Conditional Transfer Entropy in Time Series Using Mutual Information and Nonlinear Prediction.
Entropy (Basel). 2020 Oct 3;22(10):1124. doi: 10.3390/e22101124.
2
Inferring neuronal network functional connectivity with directed information.
J Neurophysiol. 2017 Aug 1;118(2):1055-1069. doi: 10.1152/jn.00086.2017. Epub 2017 May 3.
3
Transfer entropy in physical systems and the arrow of time.
Phys Rev E. 2016 Aug;94(2-1):022135. doi: 10.1103/PhysRevE.94.022135. Epub 2016 Aug 24.
4
Quantifying information transfer and mediation along causal pathways in complex systems.
Phys Rev E Stat Nonlin Soft Matter Phys. 2015 Dec;92(6):062829. doi: 10.1103/PhysRevE.92.062829. Epub 2015 Dec 28.
5
Estimating the decomposition of predictive information in multivariate systems.
Phys Rev E Stat Nonlin Soft Matter Phys. 2015 Mar;91(3):032904. doi: 10.1103/PhysRevE.91.032904. Epub 2015 Mar 6.
6
Transfer entropy--a model-free measure of effective connectivity for the neurosciences.
J Comput Neurosci. 2011 Feb;30(1):45-67. doi: 10.1007/s10827-010-0262-3. Epub 2010 Aug 13.
7
Estimating the directed information to infer causal relationships in ensemble neural spike train recordings.
J Comput Neurosci. 2011 Feb;30(1):17-44. doi: 10.1007/s10827-010-0247-2. Epub 2010 Jun 26.
8
Recurrent Neural Networks are universal approximators.
Int J Neural Syst. 2007 Aug;17(4):253-63. doi: 10.1142/S0129065707001111.
9
Estimating mutual information.
Phys Rev E Stat Nonlin Soft Matter Phys. 2004 Jun;69(6 Pt 2):066138. doi: 10.1103/PhysRevE.69.066138. Epub 2004 Jun 23.
10
Statistical assessment of nonlinear causality: application to epileptic EEG signals.
J Neurosci Methods. 2003 Apr 15;124(2):113-28. doi: 10.1016/s0165-0270(02)00367-9.