
Exploiting Signal Propagation Delays to Match Task Memory Requirements in Reservoir Computing.

Author Information

Iacob Stefan, Dambre Joni

Affiliations

IDLab-AIRO, Ghent University, 9052 Ghent, Belgium.

Publication Information

Biomimetics (Basel). 2024 Jun 14;9(6):355. doi: 10.3390/biomimetics9060355.

Abstract

Recurrent neural networks (RNNs) transmit information over time through recurrent connections. In contrast, biological neural networks use many other temporal processing mechanisms. One of these mechanisms is the inter-neuron delays caused by varying axon properties. Recently, this feature was implemented in echo state networks (ESNs), a type of RNN, by assigning spatial locations to neurons and introducing distance-dependent inter-neuron delays. These delays were shown to significantly improve ESN task performance. However, thus far, it is still unclear why distance-based delay networks (DDNs) perform better than ESNs. In this paper, we show that by optimizing inter-node delays, the memory capacity of the network matches the memory requirements of the task. As such, networks concentrate their memory capabilities to the points in the past which contain the most information for the task at hand. Moreover, we show that DDNs have a greater total linear memory capacity, with the same amount of non-linear processing power.
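The delay mechanism the abstract describes can be illustrated with a small NumPy echo state network in which each recurrent connection reads its presynaptic neuron's state a distance-dependent number of timesteps in the past. This is a minimal sketch, not the authors' implementation: the reservoir size, spectral radius, random 2D layout, and linear distance-to-delay discretisation are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50           # reservoir size (assumption)
max_delay = 5    # longest inter-neuron delay in timesteps (assumption)

# Standard ESN reservoir weights, rescaled to spectral radius 0.9
W = rng.normal(0.0, 1.0, (n, n))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, n)

# Assign random 2D positions; delay grows linearly with distance (1..max_delay)
pos = rng.uniform(0.0, 1.0, (n, 2))
dist = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
delays = 1 + np.round(dist / dist.max() * (max_delay - 1)).astype(int)

def run_ddn(u):
    """Run the delayed reservoir on a 1D input sequence, return all states."""
    T = len(u)
    hist = np.zeros((max_delay + T, n))  # rows 0..max_delay-1 are zero initial states
    cols = np.arange(n)
    for t in range(T):
        # element (i, j): state of presynaptic neuron j, read delays[i, j] steps back
        delayed = hist[max_delay + t - delays, cols]
        hist[max_delay + t] = np.tanh((W * delayed).sum(axis=1) + W_in * u[t])
    return hist[max_delay:]

states = run_ddn(np.sin(np.linspace(0.0, 8.0 * np.pi, 200)))
```

A plain ESN is the special case where every entry of `delays` equals 1; making the delays distance-dependent spreads the reservoir's memory over several points in the past, which is the property the paper analyses.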

Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/5d1d/11201534/85d424361ce1/biomimetics-09-00355-g001.jpg
