Kim Jason Z, Larsen Bart, Parkes Linden
Department of Physics, Cornell University, Ithaca, NY 14853, USA.
Department of Pediatrics, Masonic Institute for the Developing Brain, University of Minnesota.
ArXiv. 2023 Nov 27:arXiv:2311.15572v1.
Dynamics play a critical role in computation. The principled evolution of states over time enables both biological and artificial networks to represent and integrate information to make decisions. In the past few decades, significant multidisciplinary progress has been made in bridging the gap between how we understand biological and artificial computation, including how insights gained from one can translate to the other. Research has revealed that neurobiology is a key determinant of brain network architecture, which gives rise to spatiotemporally constrained patterns of activity that underlie computation. Here, we discuss how neural systems use dynamics for computation, and claim that the biological constraints that shape brain networks may be leveraged to improve the implementation of artificial neural networks. To formalize this discussion, we consider a natural artificial analog of the brain that has been used extensively to model neural computation: the recurrent neural network (RNN). In both the brain and the RNN, we emphasize the common computational substrate atop which dynamics occur, namely the connectivity between neurons, and we explore the unique computational advantages offered by biophysical constraints such as resource efficiency, spatial embedding, and neurodevelopment.
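To make concrete what "dynamics atop a connectivity substrate" means for an RNN, the following is a minimal sketch of a standard discrete-time rate RNN. The tanh nonlinearity, the state update x(t+1) = tanh(W x(t) + B u(t)), and all network sizes are illustrative assumptions, not details taken from the paper; the point is only that the connectivity matrix W fixes the substrate on which the state trajectory evolves.

```python
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_inputs, n_steps = 50, 3, 100

# Recurrent connectivity W: the shared computational substrate.
# Scaled by 1/sqrt(N) so the dynamics stay bounded.
W = rng.normal(0.0, 1.0 / np.sqrt(n_neurons), (n_neurons, n_neurons))
B = rng.normal(0.0, 1.0, (n_neurons, n_inputs))  # input weights

x = np.zeros(n_neurons)  # neural state
states = []
for t in range(n_steps):
    u = rng.normal(0.0, 1.0, n_inputs)  # external input at time t
    x = np.tanh(W @ x + B @ u)          # state evolves via the connectivity
    states.append(x.copy())

states = np.asarray(states)  # trajectory, shape (n_steps, n_neurons)
```

Imposing biological constraints of the kind the abstract names (e.g. sparsity for resource efficiency, or distance-dependent weights for spatial embedding) would amount to structuring W rather than drawing it i.i.d. at random.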