
Mutual Information Gain and Linear/Nonlinear Redundancy for Agent Learning, Sequence Analysis, and Modeling.

Author

Gibson Jerry D

Affiliation

Department of Electrical and Computer Engineering, University of California, Santa Barbara, CA 93106-9560, USA.

Publication

Entropy (Basel). 2020 May 30;22(6):608. doi: 10.3390/e22060608.

DOI: 10.3390/e22060608
PMID: 33286380
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7517148/
Abstract

In many applications, intelligent agents need to identify any structure or apparent randomness in an environment and respond appropriately. We use the relative entropy to separate and quantify the presence of both linear and nonlinear redundancy in a sequence and we introduce the new quantities of total mutual information gain and incremental mutual information gain. We illustrate how these new quantities can be used to analyze and characterize the structures and apparent randomness for purely autoregressive sequences and for speech signals with long and short term linear redundancies. The mutual information gain is shown to be an important new tool for capturing and quantifying learning for sequence modeling and analysis.
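For jointly Gaussian sequences, the mutual information between samples at a given lag is determined entirely by the linear (lag) correlation, so comparing mutual information across lags gives a simple picture of how linear redundancy decays in an autoregressive sequence. The sketch below is only a minimal illustration of that idea, not the paper's actual estimator or its definition of mutual information gain; the AR(1) coefficient, sample size, and the helper name `gaussian_mutual_information` are our own choices for the example.

```python
import numpy as np

def gaussian_mutual_information(x, lag=1):
    """Estimate I(X_n; X_{n-lag}) in nats for a stationary Gaussian sequence.

    Under the Gaussian assumption, I = -0.5 * ln(1 - rho^2), where rho is
    the lag correlation. Any excess found by a general-purpose estimator
    over this value would point to nonlinear redundancy.
    """
    x = np.asarray(x, dtype=float)
    x0, x1 = x[:-lag], x[lag:]
    rho = np.corrcoef(x0, x1)[0, 1]
    return -0.5 * np.log(1.0 - rho**2)

# Synthesize a purely autoregressive AR(1) sequence: x_n = a*x_{n-1} + w_n
rng = np.random.default_rng(0)
a, n = 0.9, 200_000
w = rng.standard_normal(n)
x = np.empty(n)
x[0] = w[0]
for i in range(1, n):
    x[i] = a * x[i - 1] + w[i]

i1 = gaussian_mutual_information(x, lag=1)  # theory: -0.5*ln(1-0.81) ≈ 0.83 nats
i2 = gaussian_mutual_information(x, lag=2)  # lag-2 correlation is a^2 = 0.81
```

As the lag grows, the lag correlation decays as `a**lag`, so the per-lag mutual information shrinks toward zero, which is one way to visualize how structure in the sequence is gradually "used up".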


Figures:
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/46c7/7517148/cc2bdbb2cb92/entropy-22-00608-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/46c7/7517148/59fc64c07d60/entropy-22-00608-g002.jpg

Similar Articles

1. Mutual Information Gain and Linear/Nonlinear Redundancy for Agent Learning, Sequence Analysis, and Modeling.
   Entropy (Basel). 2020 May 30;22(6):608. doi: 10.3390/e22060608.
2. Entropy Power, Autoregressive Models, and Mutual Information.
   Entropy (Basel). 2018 Sep 30;20(10):750. doi: 10.3390/e20100750.
3. Non-linear Feature Extraction by Redundancy Reduction in an Unsupervised Stochastic Neural Network.
   Neural Netw. 1997 Jun;10(4):683-691. doi: 10.1016/s0893-6080(96)00110-4.
4. Information Entropy Suggests Stronger Nonlinear Associations between Hydro-Meteorological Variables and ENSO.
   Entropy (Basel). 2018 Jan 9;20(1):38. doi: 10.3390/e20010038.
5. Measuring Independence between Statistical Randomness Tests by Mutual Information.
   Entropy (Basel). 2020 Jul 4;22(7):741. doi: 10.3390/e22070741.
6. Quantifying Net Synergy/Redundancy of Spontaneous Variability Regulation via Predictability and Transfer Entropy Decomposition Frameworks.
   IEEE Trans Biomed Eng. 2017 Nov;64(11):2628-2638. doi: 10.1109/TBME.2017.2654509.
7. Information-theoretic decomposition of embodied and situated systems.
   Neural Netw. 2018 Jul;103:94-107. doi: 10.1016/j.neunet.2018.03.011. Epub 2018 Mar 27.
8. Dissecting Deep Learning Networks-Visualizing Mutual Information.
   Entropy (Basel). 2018 Oct 26;20(11):823. doi: 10.3390/e20110823.
9. Principles of Mutual Information Maximization and Energy Minimization Affect the Activation Patterns of Large Scale Networks in the Brain.
   Front Comput Neurosci. 2020 Jan 9;13:86. doi: 10.3389/fncom.2019.00086. eCollection 2019.
10. Regularities unseen, randomness observed: levels of entropy convergence.
    Chaos. 2003 Mar;13(1):25-54. doi: 10.1063/1.1530990.

References Cited in This Article

1. Regularities unseen, randomness observed: levels of entropy convergence.
   Chaos. 2003 Mar;13(1):25-54. doi: 10.1063/1.1530990.