S. Kamyar Tavakoli, André Longtin
Department of Physics and Centre for Neural Dynamics, University of Ottawa, Ottawa, ON, Canada.
Front Syst Neurosci. 2021 Nov 19;15:720744. doi: 10.3389/fnsys.2021.720744. eCollection 2021.
Neural circuits operate with delays over a range of time scales, from a few milliseconds in recurrent local circuitry to tens of milliseconds or more for communication between populations. Modeling usually incorporates single fixed delays, meant to represent the mean conduction delay between neurons making up the circuit. We explore conditions under which the inclusion of more delays in a high-dimensional chaotic neural network leads to a reduction in dynamical complexity, a phenomenon recently described as multi-delay complexity collapse (CC) in delay-differential equations with one to three variables. We consider a recurrent local network of 80% excitatory and 20% inhibitory rate model neurons with 10% connection probability. An increase in the width of the distribution of local delays, even to unrealistically large values, does not cause CC, nor does adding more local delays. Interestingly, multiple small local delays can cause CC provided there is a moderate global delayed inhibitory feedback and random initial conditions. CC then occurs through the settling of transient chaos onto a limit cycle. In this regime, there is a form of noise-induced order in which the mean activity variance decreases as the noise increases and disrupts the synchrony. Another novel form of CC is seen where global delayed feedback causes "dropouts," i.e., epochs of low firing rate network synchrony. Their alternation with epochs of higher firing rate asynchrony closely follows Poisson statistics. Such dropouts are promoted by larger global feedback strength and delay. Finally, periodic driving of the chaotic regime with global feedback can cause CC; the extinction of chaos can outlast the forcing, sometimes permanently. Our results suggest a wealth of phenomena that remain to be discovered in networks with clusters of delays.
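The network described above (80% excitatory and 20% inhibitory rate neurons, 10% connection probability, a local synaptic delay, and global delayed inhibitory feedback) can be sketched as a minimal Euler-integrated simulation. This is an illustrative reconstruction, not the authors' exact equations: the `tanh` gain function, the weight scaling, and all parameter values (`tau`, `d_local`, `d_global`, `g_global`) are assumptions chosen for a self-contained demonstration.

```python
import numpy as np

def simulate_delayed_rate_network(
    n=100, frac_exc=0.8, p_conn=0.1,
    d_local=5, d_global=20, g_global=0.5,
    tau=10.0, dt=1.0, t_steps=2000, seed=0,
):
    """Euler-integrate a sparse E/I rate network with one local delay
    and global delayed inhibitory mean-field feedback (hypothetical
    parameterization, for illustration only)."""
    rng = np.random.default_rng(seed)
    n_exc = int(frac_exc * n)  # 80 excitatory, 20 inhibitory neurons

    # Sparse random connectivity: 10% of entries nonzero; excitatory
    # columns made positive, inhibitory columns negative (Dale's law).
    mask = rng.random((n, n)) < p_conn
    W = rng.normal(0.0, 1.0 / np.sqrt(p_conn * n), (n, n)) * mask
    W[:, :n_exc] = np.abs(W[:, :n_exc])
    W[:, n_exc:] = -np.abs(W[:, n_exc:])

    # History buffer long enough for the largest delay; random initial
    # conditions, as in the abstract.
    d_max = max(d_local, d_global)
    x = np.zeros((t_steps + d_max, n))
    x[:d_max] = rng.normal(0.0, 0.1, (d_max, n))

    phi = np.tanh  # assumed saturating rate nonlinearity
    for t in range(d_max, t_steps + d_max):
        local = W @ phi(x[t - d_local])              # delayed local recurrence
        glob = -g_global * phi(x[t - d_global]).mean()  # delayed global inhibition
        x[t] = x[t - 1] + (dt / tau) * (-x[t - 1] + local + glob)
    return x[d_max:]

rates = simulate_delayed_rate_network()
print(rates.shape)  # (2000, 100)
```

Varying `d_local` (or replacing the single value with a distribution of delays) and sweeping `g_global` and `d_global` would be the natural way to probe the transition from chaos to a limit cycle that the abstract calls complexity collapse.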