Suppr 超能文献



Learning in Feedforward Neural Networks Accelerated by Transfer Entropy.

Author Information

Moldovan Adrian, Caţaron Angel, Andonie Răzvan

Affiliations

Department of Electronics and Computers, Transilvania University, 500024 Braşov, Romania.

Corporate Technology, Siemens SRL, 500007 Braşov, Romania.

Publication Information

Entropy (Basel). 2020 Jan 16;22(1):102. doi: 10.3390/e22010102.

DOI: 10.3390/e22010102
PMID: 33285877
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7516405/
Abstract

Current neural network architectures are increasingly hard to train because of the growing size and complexity of the datasets used. Our objective is to design more efficient training algorithms by exploiting causal relationships inferred from neural networks. Transfer entropy (TE) was initially introduced as an information transfer measure used to quantify the statistical coherence between events (time series). It was later related to causality, even though the two are not the same. Only a few papers report applications of causality or TE in neural networks. Our contribution is an information-theoretical method for analyzing information transfer between the nodes of feedforward neural networks. The information transfer is measured by the TE of feedback neural connections. Intuitively, TE measures the relevance of a connection in the network, and the feedback amplifies this connection. We introduce a backpropagation-type training algorithm that uses TE feedback connections to improve its performance.
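The TE the abstract refers to is Schreiber's measure, T(Y→X) = Σ p(x_{t+1}, x_t, y_t) log2[ p(x_{t+1} | x_t, y_t) / p(x_{t+1} | x_t) ]. As a rough illustration of the measure itself (this is a plug-in estimator for discrete series with history length 1, not the authors' training algorithm), it can be sketched as:

```python
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Plug-in estimate of transfer entropy from series y to series x
    (history length 1): sums p(x1, x0, y0) * log2(p(x1|x0, y0) / p(x1|x0))
    over all observed (x_{t+1}, x_t, y_t) triples."""
    n = len(x) - 1
    triples = Counter((x[t + 1], x[t], y[t]) for t in range(n))  # (x1, x0, y0)
    pairs_xy = Counter((x[t], y[t]) for t in range(n))           # (x0, y0)
    pairs_xx = Counter((x[t + 1], x[t]) for t in range(n))       # (x1, x0)
    singles = Counter(x[t] for t in range(n))                    # x0
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n                                   # p(x1, x0, y0)
        p_x1_given_x0y0 = c / pairs_xy[(x0, y0)]          # p(x1 | x0, y0)
        p_x1_given_x0 = pairs_xx[(x1, x0)] / singles[x0]  # p(x1 | x0)
        te += p_joint * log2(p_x1_given_x0y0 / p_x1_given_x0)
    return te
```

With y driving x (for example x[t+1] = y[t]), the estimate is strongly positive in the y→x direction and near zero in the reverse direction; note that this plug-in estimator is biased upward on short series.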


Figures:
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c3ba/7516405/30a7efd534d3/entropy-22-00102-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c3ba/7516405/3d1148ed3629/entropy-22-00102-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c3ba/7516405/0d7187409c8b/entropy-22-00102-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c3ba/7516405/7f43b358c174/entropy-22-00102-g0A1.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c3ba/7516405/42b24b33aae4/entropy-22-00102-g0A2a.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c3ba/7516405/cfe98e98f7f8/entropy-22-00102-g0A3.jpg

Similar Articles

1. Learning in Feedforward Neural Networks Accelerated by Transfer Entropy.
   Entropy (Basel). 2020 Jan 16;22(1):102. doi: 10.3390/e22010102.
2. Learning in Convolutional Neural Networks Accelerated by Transfer Entropy.
   Entropy (Basel). 2021 Sep 16;23(9):1218. doi: 10.3390/e23091218.
3. Transfer Entropy as a Measure of Brain Connectivity: A Critical Analysis With the Help of Neural Mass Models.
   Front Comput Neurosci. 2020 Jun 5;14:45. doi: 10.3389/fncom.2020.00045. eCollection 2020.
4. Deep convolutional neural network and IoT technology for healthcare.
   Digit Health. 2024 Jan 17;10:20552076231220123. doi: 10.1177/20552076231220123. eCollection 2024 Jan-Dec.
5. Direct Feedback Alignment With Sparse Connections for Local Learning.
   Front Neurosci. 2019 May 24;13:525. doi: 10.3389/fnins.2019.00525. eCollection 2019.
6. Kendall transfer entropy: a novel measure for estimating information transfer in complex systems.
   J Neural Eng. 2023 Jul 20;20(4). doi: 10.1088/1741-2552/ace5dd.
7. TRENTOOL: a Matlab open source toolbox to analyse information flow in time series data with transfer entropy.
   BMC Neurosci. 2011 Nov 18;12:119. doi: 10.1186/1471-2202-12-119.
8. Supervised Learning in Neural Networks: Feedback-Network-Free Implementation and Biological Plausibility.
   IEEE Trans Neural Netw Learn Syst. 2022 Dec;33(12):7888-7898. doi: 10.1109/TNNLS.2021.3089134. Epub 2022 Nov 30.
9. Combinatorial evolution of regression nodes in feedforward neural networks.
   Neural Netw. 1999 Jan;12(1):175-189. doi: 10.1016/s0893-6080(98)00104-x.
10. Evolutionary optimization framework to train multilayer perceptrons for engineering applications.
    Math Biosci Eng. 2024 Jan 30;21(2):2970-2990. doi: 10.3934/mbe.2024132.

Cited By

1. Nonlinear nexus between cryptocurrency returns and COVID-19 news sentiment.
   J Behav Exp Finance. 2022 Dec;36:100747. doi: 10.1016/j.jbef.2022.100747. Epub 2022 Sep 1.
2. Studying the Evolution of Neural Activation Patterns During Training of Feed-Forward ReLU Networks.
   Front Artif Intell. 2021 Dec 23;4:642374. doi: 10.3389/frai.2021.642374. eCollection 2021.
3. Entropy Method for Decision-Making: Uncertainty Cycles in Tourism Demand.
   Entropy (Basel). 2021 Oct 20;23(11):1370. doi: 10.3390/e23111370.
4. Learning in Convolutional Neural Networks Accelerated by Transfer Entropy.
   Entropy (Basel). 2021 Sep 16;23(9):1218. doi: 10.3390/e23091218.
5. A fusion of data science and feed-forward neural network-based modelling of COVID-19 outbreak forecasting in IRAQ.
   J Biomed Inform. 2021 Jun;118:103766. doi: 10.1016/j.jbi.2021.103766. Epub 2021 Apr 22.

References

1. Dissecting Deep Learning Networks-Visualizing Mutual Information.
   Entropy (Basel). 2018 Oct 26;20(11):823. doi: 10.3390/e20110823.
2. Transfer Information Energy: A Quantitative Indicator of Information Transfer between Time Series.
   Entropy (Basel). 2018 Apr 27;20(5):323. doi: 10.3390/e20050323.
3. Functional Clusters, Hubs, and Communities in the Cortical Microconnectome.
   Cereb Cortex. 2015 Oct;25(10):3743-57. doi: 10.1093/cercor/bhu252. Epub 2014 Oct 21.
4. Multivariate information-theoretic measures reveal directed information structure and task relevant changes in fMRI connectivity.
   J Comput Neurosci. 2011 Feb;30(1):85-107. doi: 10.1007/s10827-010-0271-2. Epub 2010 Aug 27.
5. Transfer entropy--a model-free measure of effective connectivity for the neurosciences.
   J Comput Neurosci. 2011 Feb;30(1):45-67. doi: 10.1007/s10827-010-0262-3. Epub 2010 Aug 13.
6. A methodology to explain neural network classification.
   Neural Netw. 2002 Mar;15(2):237-46. doi: 10.1016/s0893-6080(01)00127-7.
7. Measuring information transfer.
   Phys Rev Lett. 2000 Jul 10;85(2):461-4. doi: 10.1103/PhysRevLett.85.461.