
Adaptive time scales in recurrent neural networks.

Affiliations

Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands.

Publication Information

Sci Rep. 2020 Jul 9;10(1):11360. doi: 10.1038/s41598-020-68169-x.

DOI: 10.1038/s41598-020-68169-x
PMID: 32647161
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7347927/
Abstract

Recent experiments have revealed a hierarchy of time scales in the visual cortex, where different stages of the visual system process information at different time scales. Recurrent neural networks are ideal models to gain insight in how information is processed by such a hierarchy of time scales and have become widely used to model temporal dynamics both in machine learning and computational neuroscience. However, in the derivation of such models as discrete time approximations of the firing rate of a population of neurons, the time constants of the neuronal process are generally ignored. Learning these time constants could inform us about the time scales underlying temporal processes in the brain and enhance the expressive capacity of the network. To investigate the potential of adaptive time constants, we compare the standard approximations to a more lenient one that accounts for the time scales at which processes unfold. We show that such a model performs better on predicting simulated neural data and allows recovery of the time scales at which the underlying processes unfold. A hierarchy of time scales emerges when adapting to data with multiple underlying time scales, underscoring the importance of such a hierarchy in processing complex temporal information.
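The "more lenient" approximation the abstract contrasts with the standard one can be illustrated with a leaky-integration update in which each unit's time constant is a parameter rather than being absorbed into the discretization. The sketch below is illustrative only, not the authors' code: the function name, weight shapes, and the choice of `tanh` are assumptions, and training of `tau` by gradient descent is omitted.

```python
import numpy as np

def leaky_rnn_step(h, x, W, U, b, tau, dt=1.0):
    """One Euler step of a rate-based RNN with per-unit time constants.

    Discretizing tau * dh/dt = -h + f(W h + U x + b) gives a leaky
    update with alpha = dt / tau: large tau means a slow unit that
    mostly keeps its old state, tau == dt recovers the standard RNN
    step. Treating tau as trainable lets each unit learn its own
    time scale, which is the idea the abstract describes.
    """
    alpha = dt / tau                           # per-unit update rate
    target = np.tanh(W @ h + U @ x + b)        # instantaneous drive
    return (1.0 - alpha) * h + alpha * target  # leaky integration

rng = np.random.default_rng(0)
n_hidden, n_in = 8, 3
W = rng.standard_normal((n_hidden, n_hidden)) * 0.1
U = rng.standard_normal((n_hidden, n_in)) * 0.1
b = np.zeros(n_hidden)
tau = np.linspace(1.0, 20.0, n_hidden)  # a hierarchy of time scales

h = np.zeros(n_hidden)
for _ in range(50):
    h = leaky_rnn_step(h, rng.standard_normal(n_in), W, U, b, tau)
```

Note that with `tau` equal to `dt` everywhere, `alpha` is 1 and the update collapses to the standard discrete-time approximation that ignores the time constants.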


Figures:
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/572c/7347927/b51cb9ce6d0c/41598_2020_68169_Fig1_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/572c/7347927/1e66f28eb839/41598_2020_68169_Fig2_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/572c/7347927/ca7d6de8a8c8/41598_2020_68169_Fig3_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/572c/7347927/cd44fff44aa4/41598_2020_68169_Fig4_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/572c/7347927/21e0004121cc/41598_2020_68169_Fig5_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/572c/7347927/f11da6d3cd85/41598_2020_68169_Fig6_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/572c/7347927/f7c11bacd723/41598_2020_68169_Fig7_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/572c/7347927/943c27a736d5/41598_2020_68169_Fig8_HTML.jpg

Similar Articles

1. Adaptive time scales in recurrent neural networks.
Sci Rep. 2020 Jul 9;10(1):11360. doi: 10.1038/s41598-020-68169-x.
2. [Dynamic paradigm in psychopathology: "chaos theory", from physics to psychiatry].
Encephale. 2001 May-Jun;27(3):260-8.
3. Deep recurrent neural network reveals a hierarchy of process memory during dynamic natural vision.
Hum Brain Mapp. 2018 May;39(5):2269-2282. doi: 10.1002/hbm.24006. Epub 2018 Feb 12.
4. Neural mechanism of dynamic responses of neurons in inferior temporal cortex in face perception.
Cogn Neurodyn. 2013 Feb;7(1):23-38. doi: 10.1007/s11571-012-9212-2. Epub 2012 Jul 20.
5. Neuronal Sequence Models for Bayesian Online Inference.
Front Artif Intell. 2021 May 21;4:530937. doi: 10.3389/frai.2021.530937. eCollection 2021.
6. Hybrid Cubature Kalman filtering for identifying nonlinear models from sampled recording: Estimation of neuronal dynamics.
PLoS One. 2017 Jul 20;12(7):e0181513. doi: 10.1371/journal.pone.0181513. eCollection 2017.
7. Deep neural networks effectively model neural adaptation to changing background noise and suggest nonlinear noise filtering methods in auditory cortex.
Neuroimage. 2023 Feb 1;266:119819. doi: 10.1016/j.neuroimage.2022.119819. Epub 2022 Dec 16.
8. Heterogeneity in Neuronal Dynamics Is Learned by Gradient Descent for Temporal Processing Tasks.
Neural Comput. 2023 Mar 18;35(4):555-592. doi: 10.1162/neco_a_01571.
9. NeuCube: a spiking neural network architecture for mapping, learning and understanding of spatio-temporal brain data.
Neural Netw. 2014 Apr;52:62-76. doi: 10.1016/j.neunet.2014.01.006. Epub 2014 Jan 20.
10. A hierarchy of time-scales and the brain.
PLoS Comput Biol. 2008 Nov;4(11):e1000209. doi: 10.1371/journal.pcbi.1000209. Epub 2008 Nov 14.

Cited By

1. The neural basis of event segmentation: Stable features in the environment are reflected by neural states.
Imaging Neurosci (Camb). 2025 Jan 15;3. doi: 10.1162/imag_a_00432. eCollection 2025.
2. Coincidence detection and integration behavior in spiking neural networks.
Cogn Neurodyn. 2024 Aug;18(4):1753-1765. doi: 10.1007/s11571-023-10038-0. Epub 2023 Dec 13.
3. Exploiting Signal Propagation Delays to Match Task Memory Requirements in Reservoir Computing.
Biomimetics (Basel). 2024 Jun 14;9(6):355. doi: 10.3390/biomimetics9060355.
4. SHIP: a computational framework for simulating and validating novel technologies in hardware spiking neural networks.
Front Neurosci. 2024 Jan 8;17:1270090. doi: 10.3389/fnins.2023.1270090. eCollection 2023.
5. The neuroconnectionist research programme.
Nat Rev Neurosci. 2023 Jul;24(7):431-450. doi: 10.1038/s41583-023-00705-w. Epub 2023 May 30.
6. Scaffolding layered control architectures through constraint closure: insights into brain evolution and development.
Philos Trans R Soc Lond B Biol Sci. 2022 Feb 14;377(1844):20200519. doi: 10.1098/rstb.2020.0519. Epub 2021 Dec 27.
7. Neural heterogeneity promotes robust learning.
Nat Commun. 2021 Oct 4;12(1):5791. doi: 10.1038/s41467-021-26022-3.
8. Few-shot pulse wave contour classification based on multi-scale feature extraction.
Sci Rep. 2021 Feb 12;11(1):3762. doi: 10.1038/s41598-021-83134-y.
9. Constructing and Forgetting Temporal Context in the Human Cerebral Cortex.
Neuron. 2020 May 20;106(4):675-686.e11. doi: 10.1016/j.neuron.2020.02.013. Epub 2020 Mar 11.

References Cited in This Article

1. Engineering a Less Artificial Intelligence.
Neuron. 2019 Sep 25;103(6):967-979. doi: 10.1016/j.neuron.2019.08.034.
2. Emergent mechanisms of evidence integration in recurrent neural networks.
PLoS One. 2018 Oct 16;13(10):e0205676. doi: 10.1371/journal.pone.0205676. eCollection 2018.
3. Discovering Event Structure in Continuous Narrative Perception and Memory.
Neuron. 2017 Aug 2;95(3):709-721.e5. doi: 10.1016/j.neuron.2017.06.041.
4. Biologically plausible learning in recurrent neural networks reproduces neural dynamics observed during cognitive tasks.
Elife. 2017 Feb 23;6:e20899. doi: 10.7554/eLife.20899.
5. Reward-based training of recurrent neural networks for cognitive and value-based tasks.
Elife. 2017 Jan 13;6:e21492. doi: 10.7554/eLife.21492.
6. Training Excitatory-Inhibitory Recurrent Neural Networks for Cognitive Tasks: A Simple and Flexible Framework.
PLoS Comput Biol. 2016 Feb 29;12(2):e1004792. doi: 10.1371/journal.pcbi.1004792. eCollection 2016 Feb.
7. Deep Neural Networks Reveal a Gradient in the Complexity of Neural Representations across the Ventral Stream.
J Neurosci. 2015 Jul 8;35(27):10005-14. doi: 10.1523/JNEUROSCI.5023-14.2015.
8. Self-Organization of Spatio-Temporal Hierarchy via Learning of Dynamic Visual Image Patterns on Action Sequences.
PLoS One. 2015 Jul 6;10(7):e0131214. doi: 10.1371/journal.pone.0131214. eCollection 2015.
9. Cell assemblies in the cerebral cortex.
Biol Cybern. 2014 Oct;108(5):559-72. doi: 10.1007/s00422-014-0596-4. Epub 2014 Apr 2.
10. Slow cortical dynamics and the accumulation of information over long timescales.
Neuron. 2012 Oct 18;76(2):423-34. doi: 10.1016/j.neuron.2012.08.011. Epub 2012 Oct 17.