Faculty of Electrical Engineering and Computer Science, Technical University of Berlin, Berlin, Germany.
Science of Intelligence, Research Cluster of Excellence, Berlin, Germany.
eLife. 2024 Nov 27;13:RP93060. doi: 10.7554/eLife.93060.
The relation between neural activity and behaviorally relevant variables is at the heart of neuroscience research. When strong, this relation is termed a neural representation. There is increasing evidence, however, for partial dissociations between activity in an area and relevant external variables. While many explanations have been proposed, a theoretical framework for the relationship between external and internal variables is lacking. Here, we utilize recurrent neural networks (RNNs) to explore the question of when and how neural dynamics and the network's output are related from a geometrical point of view. We find that training RNNs can lead to two dynamical regimes: dynamics can either be aligned with the directions that generate output variables, or oblique to them. We show that the choice of readout weight magnitude before training can serve as a control knob between the regimes, similar to recent findings in feedforward networks. These regimes are functionally distinct. Oblique networks are more heterogeneous and suppress noise in their output directions. They are furthermore more robust to perturbations along the output directions. Crucially, the oblique regime is specific to recurrent (but not feedforward) networks, arising from dynamical stability considerations. Finally, we show that tendencies toward the aligned or the oblique regime can be dissociated in neural recordings. Altogether, our results open a new perspective for interpreting neural activity by relating network dynamics and their output.
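To make the "control knob" concrete, below is a minimal sketch (not the authors' code) of the kind of experiment the abstract describes: a vanilla tanh RNN is trained to autonomously generate a target signal, starting from readout weights of either small or large magnitude, and the resulting alignment between the hidden-state trajectory and the readout direction is measured. The task (a sine wave), the network size, and the variance-fraction alignment metric are all illustrative assumptions, not the paper's exact setup.

```python
# Illustrative sketch: readout weight magnitude at initialization as a
# control knob between aligned and oblique dynamics. All hyperparameters
# and the alignment metric are assumptions for demonstration purposes.
import torch

torch.manual_seed(0)

N, T = 256, 100                        # hidden units, time steps (assumed)
t = torch.linspace(0, 2 * torch.pi, T)
target = torch.sin(t).unsqueeze(1)     # (T, 1) target output: one sine cycle


def train_rnn(readout_scale, steps=500, lr=1e-3):
    """Train an autonomous tanh RNN to produce the target, starting from
    readout weights of the given magnitude."""
    J = torch.nn.Parameter(torch.randn(N, N) / N**0.5)               # recurrent weights
    w_out = torch.nn.Parameter(readout_scale * torch.randn(N, 1) / N**0.5)
    x0 = torch.nn.Parameter(torch.randn(N) / N**0.5)                 # initial state
    opt = torch.optim.Adam([J, w_out, x0], lr=lr)
    for _ in range(steps):
        x, states = x0, []
        for _ in range(T):
            x = torch.tanh(x @ J)      # autonomous dynamics (no input)
            states.append(x)
        H = torch.stack(states)        # (T, N) hidden trajectory
        loss = ((H @ w_out - target) ** 2).mean()
        opt.zero_grad(); loss.backward(); opt.step()
    return H.detach(), w_out.detach()


def alignment(H, w_out):
    """Fraction of trajectory variance lying along the readout direction --
    one plausible (assumed) way to quantify aligned vs. oblique dynamics."""
    Hc = H - H.mean(0)
    u = w_out[:, 0] / w_out[:, 0].norm()
    var_along = ((Hc @ u) ** 2).mean()
    return (var_along / (Hc ** 2).sum(1).mean()).item()


for scale in (0.1, 10.0):              # small vs. large initial readout
    H, w = train_rnn(scale)
    print(f"init readout scale {scale:>5}: alignment = {alignment(H, w):.4f}")
```

Under the paper's account, the small-readout network should end up with a larger fraction of its activity variance along the output direction (aligned regime), while the large-readout network should place most of its variance in directions oblique to the readout; the exact values here depend on the assumed task and hyperparameters.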