

Probing the Relationship Between Latent Linear Dynamical Systems and Low-Rank Recurrent Neural Network Models.

Authors

Valente Adrian, Ostojic Srdjan, Pillow Jonathan W

Affiliations

Laboratoire de Neurosciences Cognitives et Computationnelles, INSERM U960, Ecole Normale Superieure-PSL Research University, 75005 Paris, France

Princeton Neuroscience Institute, Princeton University, Princeton, NJ 08544, U.S.A.

Publication

Neural Comput. 2022 Aug 16;34(9):1871-1892. doi: 10.1162/neco_a_01522.

DOI: 10.1162/neco_a_01522
PMID: 35896161
Abstract

A large body of work has suggested that neural populations exhibit low-dimensional dynamics during behavior. However, there are a variety of different approaches for modeling low-dimensional neural population activity. One approach involves latent linear dynamical system (LDS) models, in which population activity is described by a projection of low-dimensional latent variables with linear dynamics. A second approach involves low-rank recurrent neural networks (RNNs), in which population activity arises directly from a low-dimensional projection of past activity. Although these two modeling approaches have strong similarities, they arise in different contexts and tend to have different domains of application. Here we examine the precise relationship between latent LDS models and linear low-rank RNNs. When can one model class be converted to the other, and vice versa? We show that latent LDS models can only be converted to RNNs in specific limit cases, due to the non-Markovian property of latent LDS models. Conversely, we show that linear RNNs can be mapped onto LDS models, with latent dimensionality at most twice the rank of the RNN. A surprising consequence of our results is that a partially observed RNN is better represented by an LDS model than by an RNN consisting of only observed units.
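The core claim — that a linear rank-R RNN is effectively low-dimensional and can be captured by a latent model of modest dimensionality — can be sketched numerically. The snippet below is an illustrative toy, not the paper's code: it simulates a linear RNN with rank-R connectivity J = m nᵀ and checks that, after one update, population activity is confined to the R-dimensional column space of m (all sizes and variable names are arbitrary choices for the demonstration).

```python
import numpy as np

rng = np.random.default_rng(0)
N, R, T = 50, 2, 20  # neurons, connectivity rank, time steps (illustrative)

# Rank-R connectivity J = m @ n.T; n is scaled by 1/N to keep dynamics stable
m = rng.standard_normal((N, R))
n = rng.standard_normal((N, R)) / N

x = rng.standard_normal(N)  # random initial condition
traj = []
for _ in range(T):
    x = m @ (n.T @ x)  # low-rank update: x_{t+1} = J x_t, without forming J
    traj.append(x.copy())
traj = np.array(traj)   # shape (T, N)

# After the first step, every state x_t = m @ z_t lies in span(m), an
# R-dimensional subspace: projecting onto it reconstructs the trajectory.
Q, _ = np.linalg.qr(m)          # orthonormal basis for the column space of m
recon = traj @ Q @ Q.T          # project each state onto span(m)
print(np.allclose(recon, traj)) # True: dynamics live in an R-dim latent space
```

Note this only shows the deterministic case; the paper's stronger result concerns noisy linear RNNs, where partial observation makes the mapping onto an LDS require a latent dimensionality up to twice the rank.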


Similar Articles

1. Probing the Relationship Between Latent Linear Dynamical Systems and Low-Rank Recurrent Neural Network Models.
   Neural Comput. 2022 Aug 16;34(9):1871-1892. doi: 10.1162/neco_a_01522.
2. Considerations in using recurrent neural networks to probe neural dynamics.
   J Neurophysiol. 2019 Dec 1;122(6):2504-2521. doi: 10.1152/jn.00467.2018. Epub 2019 Oct 16.
3. Expressive architectures enhance interpretability of dynamics-based neural population models.
   Neuron Behav Data Anal Theory. 2023;2023. doi: 10.51628/001c.73987. Epub 2023 Mar 28.
4. Learning dynamical systems by recurrent neural networks from orbits.
   Neural Netw. 1998 Dec;11(9):1589-1599. doi: 10.1016/s0893-6080(98)00098-7.
5. Reconstructing Genetic Regulatory Networks Using Two-Step Algorithms with the Differential Equation Models of Neural Networks.
   Interdiscip Sci. 2018 Dec;10(4):823-835. doi: 10.1007/s12539-017-0254-3. Epub 2017 Jul 26.
6. Shaping Dynamics With Multiple Populations in Low-Rank Recurrent Networks.
   Neural Comput. 2021 May 13;33(6):1572-1615. doi: 10.1162/neco_a_01381.
7. Recurrent neural network from adder's perspective: Carry-lookahead RNN.
   Neural Netw. 2021 Dec;144:297-306. doi: 10.1016/j.neunet.2021.08.032. Epub 2021 Sep 6.
8. Structured flexibility in recurrent neural networks via neuromodulation.
   bioRxiv. 2024 Jul 26:2024.07.26.605315. doi: 10.1101/2024.07.26.605315.
9. Recurrent neural networks with explicit representation of dynamic latent variables can mimic behavioral patterns in a physical inference task.
   Nat Commun. 2022 Oct 4;13(1):5865. doi: 10.1038/s41467-022-33581-6.
10. Markovian architectural bias of recurrent neural networks.
    IEEE Trans Neural Netw. 2004 Jan;15(1):6-15. doi: 10.1109/TNN.2003.820839.

Cited By

1. A neural manifold view of the brain.
   Nat Neurosci. 2025 Jul 28. doi: 10.1038/s41593-025-02031-z.
2. Elucidating the selection mechanisms in context-dependent computation through low-rank neural network modeling.
   Elife. 2025 Jul 3;13:RP103636. doi: 10.7554/eLife.103636.
3. Transformations in prefrontal ensemble activity underlying rapid threat avoidance learning.
   Curr Biol. 2025 Mar 10;35(5):1128-1136.e4. doi: 10.1016/j.cub.2025.01.010. Epub 2025 Feb 11.
4. Inferring context-dependent computations through linear approximations of prefrontal cortex dynamics.
   Sci Adv. 2024 Dec 20;10(51):eadl4743. doi: 10.1126/sciadv.adl4743. Epub 2024 Dec 18.
5. Humans actively reconfigure neural task states.
   bioRxiv. 2025 Feb 28:2024.09.29.615736. doi: 10.1101/2024.09.29.615736.
6. Shaping dynamical neural computations using spatiotemporal constraints.
   Biochem Biophys Res Commun. 2024 Oct 8;728:150302. doi: 10.1016/j.bbrc.2024.150302. Epub 2024 Jun 25.
7. Transition to chaos separates learning regimes and relates to measure of consciousness in recurrent neural networks.
   bioRxiv. 2024 May 15:2024.05.15.594236. doi: 10.1101/2024.05.15.594236.
8. Shaping dynamical neural computations using spatiotemporal constraints.
   ArXiv. 2023 Nov 27:arXiv:2311.15572v1.
9. Geometry of population activity in spiking networks with low-rank structure.
   PLoS Comput Biol. 2023 Aug 7;19(8):e1011315. doi: 10.1371/journal.pcbi.1011315. eCollection 2023 Aug.
10. A unifying perspective on neural manifolds and circuits for cognition.
    Nat Rev Neurosci. 2023 Jun;24(6):363-377. doi: 10.1038/s41583-023-00693-x. Epub 2023 Apr 13.