
Unconditional stability of a recurrent neural circuit implementing divisive normalization.

Author information

Rawat Shivang, Heeger David J, Martiniani Stefano

Affiliations

Courant Institute of Mathematical Sciences, NYU.

Center for Soft Matter Research, Department of Physics, NYU.

Publication information

ArXiv. 2025 Jan 15:arXiv:2409.18946v3.

Abstract

Stability in recurrent neural models poses a significant challenge, particularly in developing biologically plausible neurodynamical models that can be seamlessly trained. Traditional cortical circuit models are notoriously difficult to train due to expansive nonlinearities in the dynamical system, leading to an optimization problem with nonlinear stability constraints that are difficult to impose. Conversely, recurrent neural networks (RNNs) excel in tasks involving sequential data but lack biological plausibility and interpretability. In this work, we address these challenges by linking dynamic divisive normalization (DN) to the stability of "oscillatory recurrent gated neural integrator circuits" (ORGaNICs), a biologically plausible recurrent cortical circuit model that dynamically achieves DN and that has been shown to simulate a wide range of neurophysiological phenomena. By using the indirect method of Lyapunov, we prove the remarkable property of unconditional local stability for an arbitrary-dimensional ORGaNICs circuit when the recurrent weight matrix is the identity. We thus connect ORGaNICs to a system of coupled damped harmonic oscillators, which enables us to derive the circuit's energy function, providing a normative principle of what the circuit, and individual neurons, aim to accomplish. Further, for a generic recurrent weight matrix, we prove the stability of the 2D model and demonstrate empirically that stability holds in higher dimensions. Finally, we show that ORGaNICs can be trained by backpropagation through time without gradient clipping/scaling, thanks to its intrinsic stability property and adaptive time constants, which address the problems of exploding, vanishing, and oscillating gradients. By evaluating the model's performance on RNN benchmarks, we find that ORGaNICs outperform alternative neurodynamical models on static image classification tasks and perform comparably to LSTMs on sequential tasks.
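The abstract refers to two key technical ingredients: dynamic divisive normalization (each neuron's response is divided by a shared normalization pool) and Lyapunov's indirect method (local stability is decided by the eigenvalues of the Jacobian at a fixed point). As a minimal illustration of both ideas — a toy rate model, NOT the ORGaNICs equations, whose exact form, parameters, and recurrent weight structure are given in the paper — the following sketch simulates a simple normalization circuit and checks its local stability by linearization. All variable names and parameter values here are illustrative assumptions.

```python
import numpy as np

# Toy dynamic divisive-normalization circuit (illustrative sketch only):
#   tau * dy_i/dt = -y_i * (sigma^2 + sum_j y_j) + z_i^2
# At the fixed point, y_i = z_i^2 / (sigma^2 + sum_j y_j): each response
# is the squared input drive divided by a shared normalization signal.

tau, sigma = 10.0, 1.0
z = np.array([1.0, 2.0, 3.0])  # feedforward input drives (arbitrary)

def dynamics(y):
    return (-y * (sigma**2 + y.sum()) + z**2) / tau

# Closed-form fixed point: with S = sum_j y_j, summing the fixed-point
# equations gives S^2 + sigma^2 * S - sum_j z_j^2 = 0.
S = (-sigma**2 + np.sqrt(sigma**4 + 4 * np.sum(z**2))) / 2
y_star = z**2 / (sigma**2 + S)

def numerical_jacobian(f, y0, eps=1e-6):
    """Forward-difference Jacobian of f at y0."""
    n = y0.size
    J = np.zeros((n, n))
    f0 = f(y0)
    for i in range(n):
        dy = np.zeros(n)
        dy[i] = eps
        J[:, i] = (f(y0 + dy) - f0) / eps
    return J

# Lyapunov's indirect method: the fixed point is locally
# asymptotically stable if every Jacobian eigenvalue has Re < 0.
eigs = np.linalg.eigvals(numerical_jacobian(dynamics, y_star))
print("max Re(eig):", eigs.real.max())  # negative => locally stable
```

For this toy model the Jacobian is `-(sigma^2 + S) I - y* 1^T` scaled by `1/tau`, so all eigenvalues are strictly negative for any input, mirroring (in a much simpler setting) the unconditional-stability flavor of the paper's result; the paper's proof handles the full ORGaNICs circuit, including its connection to coupled damped harmonic oscillators.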


https://cdn.ncbi.nlm.nih.gov/pmc/blobs/de7c/11745109/adaed2a5feb3/nihpp-2409.18946v3-f0002.jpg
