
Optimal modularity and memory capacity of neural reservoirs.

Author Information

Nathaniel Rodriguez, Eduardo Izquierdo, Yong-Yeol Ahn

Affiliations

School of Informatics, Computing, and Engineering, Indiana University, Bloomington, IN, USA.

Publication Information

Netw Neurosci. 2019 Apr 1;3(2):551-566. doi: 10.1162/netn_a_00082. eCollection 2019.

Abstract

The neural network is a powerful computing framework that has been exploited by biological evolution and by humans for solving diverse problems. Although the computational capabilities of neural networks are determined by their structure, the current understanding of the relationships between a neural network's architecture and function is still primitive. Here we reveal that a neural network's modular architecture plays a vital role in determining the neural dynamics and memory performance of the network of threshold neurons. In particular, we demonstrate that there exists an optimal modularity for memory performance, where a balance between local cohesion and global connectivity is established, allowing optimally modular networks to remember longer. Our results suggest that insights from dynamical analysis of neural networks and information-spreading processes can be leveraged to better design neural networks and may shed light on the brain's modular organization.
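To make the setup concrete, below is a minimal sketch in Python of the kind of experiment the abstract describes: a reservoir of binary threshold neurons whose weight matrix is organized into modules, with a mixing parameter controlling the fraction of inter-module connections, and memory capacity estimated with the standard delayed-recall measure using a linear readout. The module count, neuron update rule, input scaling, and ridge-regression readout used here are illustrative assumptions, not the authors' exact protocol; sweeping the mixing parameter and comparing memory capacities is the kind of experiment that would expose an intermediate optimum between isolated modules and a fully random network.

```python
# Minimal sketch (assumptions, not the paper's exact setup): a reservoir of binary
# threshold neurons arranged in modules, where a mixing parameter mu sets the
# fraction of inter-module edges. Memory capacity is estimated with the standard
# delayed-recall measure from reservoir computing (one linear readout per delay).
import numpy as np

rng = np.random.default_rng(0)

def modular_weights(n=200, n_modules=4, k=10, mu=0.2, w_scale=1.0):
    """Random directed weight matrix: each neuron receives k inputs, a fraction mu
    of which come from other modules (mu ~ 0: isolated modules, mu ~ 1: random)."""
    module = np.arange(n) % n_modules
    W = np.zeros((n, n))
    for i in range(n):
        same = np.where(module == module[i])[0]
        diff = np.where(module != module[i])[0]
        n_out = rng.binomial(k, mu)  # number of inter-module inputs for neuron i
        src = np.concatenate([rng.choice(diff, n_out, replace=False),
                              rng.choice(same, k - n_out, replace=False)])
        W[i, src] = rng.normal(0, w_scale, size=src.size)
    return W

def run_reservoir(W, u, w_in_scale=1.0, theta=0.0):
    """Drive step-function (threshold) neurons with scalar input u(t); return states."""
    n = W.shape[0]
    w_in = rng.normal(0, w_in_scale, size=n)
    x = np.zeros(n)
    X = np.zeros((len(u), n))
    for t, ut in enumerate(u):
        x = (W @ x + w_in * ut > theta).astype(float)  # binary threshold update
        X[t] = x
    return X

def memory_capacity(X, u, max_delay=30, washout=100, ridge=1e-6):
    """MC = sum over delays d of squared correlation between readout and u(t - d)."""
    mc = 0.0
    for d in range(1, max_delay + 1):
        Xd, yd = X[washout:-1], u[washout - d:-1 - d]  # predict u(t - d) from x(t)
        A = Xd.T @ Xd + ridge * np.eye(Xd.shape[1])
        w = np.linalg.solve(A, Xd.T @ yd)              # ridge-regression readout
        c = np.corrcoef(Xd @ w, yd)[0, 1]
        mc += 0.0 if np.isnan(c) else c ** 2
    return mc

u = rng.uniform(-1, 1, 2000)
for mu in (0.05, 0.2, 0.5, 1.0):  # sweep the inter-module mixing parameter
    X = run_reservoir(modular_weights(mu=mu), u)
    print(f"mu={mu:.2f}  MC~{memory_capacity(X, u):.2f}")
```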

Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c84f/6497001/7c50ea48cf4b/netn-03-551-g001.jpg
