Estimating entropy rates with Bayesian confidence intervals.

Authors

Kennel Matthew B, Shlens Jonathon, Abarbanel Henry D I, Chichilnisky E J

Affiliation

Institute for Nonlinear Science, University of California, San Diego, La Jolla, CA 92093-0402, USA.

Publication

Neural Comput. 2005 Jul;17(7):1531-76. doi: 10.1162/0899766053723050.

DOI: 10.1162/0899766053723050
PMID: 15901407
Abstract

The entropy rate quantifies the amount of uncertainty or disorder produced by any dynamical system. In a spiking neuron, this uncertainty translates into the amount of information potentially encoded, and it is thus the subject of intense theoretical and experimental investigation. Estimating this quantity from observed, experimental data is difficult and requires a judicious selection of probabilistic models, balancing two opposing sources of bias. We use a model-weighting principle originally developed for lossless data compression, following the minimum description length principle. This weighting yields a direct estimator of the entropy rate, which, compared to existing methods, exhibits significantly less bias and converges faster in simulation. With Monte Carlo techniques, we estimate a Bayesian confidence interval for the entropy rate. In related work, we apply these ideas to estimate the information rates between sensory stimuli and neural responses in experimental data (Shlens, Kennel, Abarbanel, & Chichilnisky, in preparation).

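The paper's estimator rests on MDL-style model weighting, which is involved; the Monte Carlo credible-interval idea, however, can be illustrated with a much simpler plug-in Bayesian scheme: place a Dirichlet posterior over the probabilities of length-L spike words, sample word distributions from it, and compute an entropy-rate estimate for each sample. The sketch below is a minimal illustration of that idea, not the paper's method; the function names, the word length L=4, and the Jeffreys-style Dirichlet(1/2) prior are all assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def block_counts(spikes, L):
    """Count occurrences of each length-L binary word in the spike train."""
    counts = {}
    for i in range(len(spikes) - L + 1):
        w = tuple(spikes[i:i + L])
        counts[w] = counts.get(w, 0) + 1
    return np.array(list(counts.values()), dtype=float)

def entropy_rate_ci(spikes, L=4, n_mc=2000, alpha=0.05):
    """Monte Carlo Bayesian credible interval for the entropy rate (bits/symbol).

    Puts a Dirichlet(counts + 1/2) posterior on the word probabilities,
    draws n_mc samples from it, and evaluates H(words)/L for each draw.
    This is a plug-in illustration, not the MDL-weighted estimator.
    """
    counts = block_counts(spikes, L)
    post = counts + 0.5                      # Jeffreys-style posterior parameters
    samples = rng.dirichlet(post, size=n_mc)  # (n_mc, n_words) probability vectors
    p = np.clip(samples, 1e-300, 1.0)         # guard log2 against numerical zeros
    h = -np.sum(samples * np.log2(p), axis=1) / L
    lo, hi = np.quantile(h, [alpha / 2, 1 - alpha / 2])
    return float(h.mean()), (float(lo), float(hi))

# Example: a Bernoulli(0.5) spike train has true entropy rate 1 bit/symbol.
spikes = rng.integers(0, 2, size=5000)
mean_h, (lo, hi) = entropy_rate_ci(spikes)
print(f"entropy rate ~ {mean_h:.3f} bits/symbol, 95% interval ({lo:.3f}, {hi:.3f})")
```

Note that this plug-in approach carries exactly the downward bias the paper is designed to avoid: longer words sharpen the estimate but inflate the state space relative to the data, which is the bias trade-off the abstract mentions.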

Similar Articles

1. Estimating entropy rates with Bayesian confidence intervals.
Neural Comput. 2005 Jul;17(7):1531-76. doi: 10.1162/0899766053723050.
2. Estimating information rates with confidence intervals in neural spike trains.
Neural Comput. 2007 Jul;19(7):1683-719. doi: 10.1162/neco.2007.19.7.1683.
3. Optimal instruments and models for noisy chaos.
Chaos. 2007 Dec;17(4):043127. doi: 10.1063/1.2818152.
4. Efficient computation of confidence intervals for Bayesian model predictions based on multidimensional parameter space.
Methods Enzymol. 2009;454:213-31. doi: 10.1016/S0076-6879(08)03808-1.
5. Bayesian propensity score analysis for observational data.
Stat Med. 2009 Jan 15;28(1):94-112. doi: 10.1002/sim.3460.
6. Coverage-adjusted entropy estimation.
Stat Med. 2007 Sep 20;26(21):4039-60. doi: 10.1002/sim.2942.
7. A Bayesian model comparison approach to inferring positive selection.
Mol Biol Evol. 2005 Dec;22(12):2531-40. doi: 10.1093/molbev/msi250. Epub 2005 Aug 24.
8. Large-sample Bayesian posterior distributions for probabilistic sensitivity analysis.
Med Decis Making. 2006 Sep-Oct;26(5):512-34. doi: 10.1177/0272989X06290487.
9. Positional entropy during pigeon homing II: navigational interpretation of Bayesian latent state models.
J Theor Biol. 2004 Mar 7;227(1):25-38. doi: 10.1016/j.jtbi.2003.07.003.
10. Positional entropy during pigeon homing I: application of Bayesian latent state modelling.
J Theor Biol. 2004 Mar 7;227(1):39-50. doi: 10.1016/j.jtbi.2003.07.002.

Cited By

1. Spatio-Temporal Patterns of the SARS-CoV-2 Epidemic in Germany.
Entropy (Basel). 2023 Jul 29;25(8):1137. doi: 10.3390/e25081137.
2. Natural Language Generation and Understanding of Big Code for AI-Assisted Programming: A Review.
Entropy (Basel). 2023 Jun 1;25(6):888. doi: 10.3390/e25060888.
3. Discrete Information Dynamics with Confidence via the Computational Mechanics Bootstrap: Confidence Sets and Significance Tests for Information-Dynamic Measures.
Entropy (Basel). 2020 Jul 17;22(7):782. doi: 10.3390/e22070782.
4. Exploring the Relationship among Predictability, Prediction Accuracy and Data Frequency of Financial Time Series.
Entropy (Basel). 2020 Dec 6;22(12):1381. doi: 10.3390/e22121381.
5. Information processing in the LGN: a comparison of neural codes and cell types.
Biol Cybern. 2019 Aug;113(4):453-464. doi: 10.1007/s00422-019-00801-0. Epub 2019 Jun 26.
6. A statistical framework for neuroimaging data analysis based on mutual information estimated via a Gaussian copula.
Hum Brain Mapp. 2017 Mar;38(3):1541-1573. doi: 10.1002/hbm.23471. Epub 2016 Nov 17.
7. Consequences of converting graded to action potentials upon neural information coding and energy efficiency.
PLoS Comput Biol. 2014 Jan;10(1):e1003439. doi: 10.1371/journal.pcbi.1003439. Epub 2014 Jan 23.
8. Synergy, redundancy, and multivariate information measures: an experimentalist's perspective.
J Comput Neurosci. 2014 Apr;36(2):119-40. doi: 10.1007/s10827-013-0458-4. Epub 2013 Jul 3.
9. Information transmission in cercal giant interneurons is unaffected by axonal conduction noise.
PLoS One. 2012;7(1):e30115. doi: 10.1371/journal.pone.0030115. Epub 2012 Jan 12.
10. Information theory in neuroscience.
J Comput Neurosci. 2011 Feb;30(1):1-5. doi: 10.1007/s10827-011-0314-3.