

Differential Geometry Methods for Constructing Manifold-Targeted Recurrent Neural Networks.

Affiliation

Sainsbury Wellcome Centre for Neural Circuits and Behaviour, University College London, London W1T 4JG, U.K.

Publication

Neural Comput. 2022 Jul 14;34(8):1790-1811. doi: 10.1162/neco_a_01511.

DOI: 10.1162/neco_a_01511
PMID: 35798324
Abstract

Neural computations can be framed as dynamical processes, whereby the structure of the dynamics within a neural network is a direct reflection of the computations that the network performs. A key step in generating mechanistic interpretations within this computation through dynamics framework is to establish the link among network connectivity, dynamics, and computation. This link is only partly understood. Recent work has focused on producing algorithms for engineering artificial recurrent neural networks (RNN) with dynamics targeted to a specific goal manifold. Some of these algorithms require only a set of vectors tangent to the target manifold to be computed and thus provide a general method that can be applied to a diverse set of problems. Nevertheless, computing such vectors for an arbitrary manifold in a high-dimensional state space remains highly challenging, which in practice limits the applicability of this approach. Here we demonstrate how topology and differential geometry can be leveraged to simplify this task by first computing tangent vectors on a low-dimensional topological manifold and then embedding these in state space. The simplicity of this procedure greatly facilitates the creation of manifold-targeted RNNs, as well as the process of designing task-solving, on-manifold dynamics. This new method should enable the application of network engineering-based approaches to a wide set of problems in neuroscience and machine learning. Our description of how fundamental concepts from differential geometry can be mapped onto different aspects of neural dynamics is a further demonstration of how the language of differential geometry can enrich the conceptual framework for describing neural dynamics and computation.
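As an illustrative sketch of the procedure the abstract describes (not the paper's actual algorithm), consider a ring as the target manifold: tangent vectors are computed analytically in its 2-D intrinsic coordinates, linearly embedded into a high-dimensional state space, and a linear network whose flow matches those tangent directions is then fit by least squares. The orthonormal embedding, sample count, and constant-drift dynamics below are all assumptions chosen for simplicity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Intrinsic description of the target manifold: a ring parameterized by
# theta. Tangent vectors are trivial to compute in these 2-D coordinates
# (dx/dtheta) -- the low-dimensional simplification the abstract describes.
thetas = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
ring = np.stack([np.cos(thetas), np.sin(thetas)], axis=1)       # (200, 2)
tangents = np.stack([-np.sin(thetas), np.cos(thetas)], axis=1)  # dx/dtheta

# Embed manifold points and tangent vectors into an N-dimensional state
# space via an orthonormal linear map E (an illustrative choice).
N = 50
E, _ = np.linalg.qr(rng.standard_normal((N, 2)))  # (N, 2), E.T @ E = I
X = ring @ E.T        # embedded manifold points,  (200, N)
T = tangents @ E.T    # embedded tangent vectors, (200, N)

# Engineer a linear network  dx/dt = W x  whose velocity at each sample
# point equals the desired on-manifold flow (steady drift along the
# ring): solve  min_W ||X W^T - T||  by least squares.
W = np.linalg.lstsq(X, T, rcond=None)[0].T        # (N, N)

# Simulate with forward Euler; the state should circulate while staying
# close to the embedded ring (radius near 1 after projecting back via E).
x = X[0].copy()
dt = 1e-3
for _ in range(5000):
    x = x + dt * (W @ x)
radius = float(np.linalg.norm(E.T @ x))
print(round(radius, 2))
```

Because the fitted flow is consistent with the embedded tangent field, the simulated trajectory drifts along the ring rather than off it; swapping the ring for another parameterizable manifold only changes the first block.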


Similar Articles

1. Differential Geometry Methods for Constructing Manifold-Targeted Recurrent Neural Networks.
Neural Comput. 2022 Jul 14;34(8):1790-1811. doi: 10.1162/neco_a_01511.

2. Learning to represent continuous variables in heterogeneous neural networks.
Cell Rep. 2022 Apr 5;39(1):110612. doi: 10.1016/j.celrep.2022.110612.

3. Reconstructing Genetic Regulatory Networks Using Two-Step Algorithms with the Differential Equation Models of Neural Networks.
Interdiscip Sci. 2018 Dec;10(4):823-835. doi: 10.1007/s12539-017-0254-3. Epub 2017 Jul 26.

4. Considerations in using recurrent neural networks to probe neural dynamics.
J Neurophysiol. 2019 Dec 1;122(6):2504-2521. doi: 10.1152/jn.00467.2018. Epub 2019 Oct 16.

5. Engineering recurrent neural networks from task-relevant manifolds and dynamics.
PLoS Comput Biol. 2020 Aug 12;16(8):e1008128. doi: 10.1371/journal.pcbi.1008128. eCollection 2020 Aug.

6. Learning a discriminative SPD manifold neural network for image set classification.
Neural Netw. 2022 Jul;151:94-110. doi: 10.1016/j.neunet.2022.03.012. Epub 2022 Mar 16.

7. The dynamics of discrete-time computation, with application to recurrent neural networks and finite state machine extraction.
Neural Comput. 1996 Aug 15;8(6):1135-78. doi: 10.1162/neco.1996.8.6.1135.

8. A Geometrical Analysis of Global Stability in Trained Feedback Networks.
Neural Comput. 2019 Jun;31(6):1139-1182. doi: 10.1162/neco_a_01187. Epub 2019 Apr 12.

9. A singular Riemannian geometry approach to deep neural networks II. Reconstruction of 1-D equivalence classes.
Neural Netw. 2023 Jan;158:344-358. doi: 10.1016/j.neunet.2022.11.026. Epub 2022 Nov 23.

10. Backpropagation algorithms and Reservoir Computing in Recurrent Neural Networks for the forecasting of complex spatiotemporal dynamics.
Neural Netw. 2020 Jun;126:191-217. doi: 10.1016/j.neunet.2020.02.016. Epub 2020 Mar 21.