
Reservoir computing using self-sustained oscillations in a locally connected neural network.

Author Information

Kawai Yuji, Park Jihoon, Asada Minoru

Affiliation Information

Symbiotic Intelligent Systems Research Center, Institute for Open and Transdisciplinary Research Initiatives, Osaka University, Suita, Osaka, 565-0871, Japan.

Center for Information and Neural Networks, National Institute of Information and Communications Technology, Suita, Osaka, 565-0871, Japan.

Publication Information

Sci Rep. 2023 Sep 19;13(1):15532. doi: 10.1038/s41598-023-42812-9.

Abstract

Understanding how the structural organization of neural networks influences their computational capabilities is of great interest to both the machine learning and neuroscience communities. In our previous work, we introduced a novel learning system, called the reservoir of basal dynamics (reBASICS), which features a modular neural architecture (many small random neural networks) capable of reducing the chaoticity of neural activity and producing stable, self-sustained limit cycle activities. These limit cycles are integrated by a weighted linear summation, and arbitrary time series are learned by modulating these weights. Despite its excellent learning performance, interpreting a modular structure of isolated small networks as a brain network has posed a significant challenge. Here, we investigate empirically how local connectivity, a well-known characteristic of brain networks, contributes to reducing the chaoticity of the neural system and generates self-sustained limit cycles. Moreover, we present the learning performance of locally connected reBASICS on two tasks: a motor timing task and learning the Lorenz time series. Although its performance was inferior to that of modular reBASICS, locally connected reBASICS could learn time series tens of seconds long, even though the time constant of its neural units was ten milliseconds. This work indicates that the locality of connectivity in neural networks may contribute to the generation of stable self-sustained oscillations for learning arbitrary long-term time series, as well as to the economy of wiring cost.
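To make the reservoir-plus-readout idea concrete, the sketch below builds a locally connected rate network on a ring, lets it run autonomously, and trains a linear readout by ridge regression. This is not the authors' code: the network size, gain, connection radius, integration step, target signal, and the batch ridge readout are all illustrative assumptions (reBASICS has its own architecture and training scheme); only the 10 ms unit time constant is taken from the abstract.

```python
# Minimal sketch (assumptions noted above) of reservoir computing with
# local connectivity: units on a 1-D ring connect only to near neighbors,
# and a linear readout is fit to a target time series.
import numpy as np

rng = np.random.default_rng(0)

N = 300       # number of rate units (assumed)
tau = 0.01    # 10 ms unit time constant, as stated in the abstract
dt = 0.001    # integration step (assumed)
radius = 10   # local connection radius on the ring (assumed)
g = 1.5       # recurrent gain (assumed)

# Local recurrent weights: unit i receives input only from units within
# `radius` of i on the ring, mimicking spatially local wiring.
W = np.zeros((N, N))
for i in range(N):
    for j in range(N):
        d = min(abs(i - j), N - abs(i - j))
        if 0 < d <= radius:
            W[i, j] = rng.normal(0.0, g / np.sqrt(2 * radius))

T = 5000                                              # 5 s of activity
target = np.sin(2 * np.pi * 1.0 * np.arange(T) * dt)  # toy 1 Hz target

# Run the autonomous reservoir (no external input) and record the rates.
x = rng.normal(0.0, 0.5, N)   # membrane state
R = np.zeros((T, N))
for t in range(T):
    r = np.tanh(x)
    x += dt / tau * (-x + W @ r)
    R[t] = r

# Linear readout fit by ridge regression; the weighted sum of unit
# activities reproduces the target (a batch stand-in for the paper's
# own readout training).
lam = 1e-3
w_out = np.linalg.solve(R.T @ R + lam * np.eye(N), R.T @ target)
print("training MSE:", np.mean((R @ w_out - target) ** 2))
```

Because only the readout weights are trained, learning reduces to a linear regression over the recorded reservoir states; whether the autonomous activity is a usable self-sustained oscillation depends on the gain and connection radius chosen above.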


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/34c6/10509144/724ab498395c/41598_2023_42812_Fig1_HTML.jpg
