Department of Neuroscience, University of Pennsylvania, Philadelphia, PA 19104, U.S.A.
Departments of Neuroscience and Bioengineering, University of Pennsylvania, Philadelphia, PA 19104, U.S.A.
Neural Comput. 2021 May 13;33(6):1554-1571. doi: 10.1162/neco_a_01390.
Physiological experiments have highlighted how the dendrites of biological neurons can nonlinearly process distributed synaptic inputs. However, it is unclear how aspects of a dendritic tree, such as its branched morphology or its repetition of presynaptic inputs, determine neural computation beyond this apparent nonlinearity. Here we use a simple model in which the dendrite is implemented as a sequence of thresholded linear units. We manipulate the architecture of this model to investigate the impact of binary branching constraints and of the repetition of synaptic inputs on neural computation. We find that models with such manipulations can perform well on machine learning tasks such as Fashion MNIST and Extended MNIST. Model performance on these tasks is limited by binary tree branching and dendritic asymmetry, and is improved by repeating synaptic inputs onto different dendritic branches. These computational experiments advance neuroscience theory on how different dendritic properties might determine neural computation on clearly defined tasks.
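The architecture described in the abstract — a dendrite modeled as a binary tree of thresholded linear units, where each unit combines two inputs and passes the result through a threshold nonlinearity — can be sketched roughly as below. This is a minimal illustration under stated assumptions, not the authors' implementation: the function name `binary_dendrite_tree`, the use of a ReLU as the thresholded linear unit, and the level-by-level weight layout are all choices made here for clarity.

```python
import numpy as np

def relu(x):
    """Thresholded linear unit: zero below threshold, linear above."""
    return np.maximum(0.0, x)

def binary_dendrite_tree(x, weights, biases):
    """Pass a flat input through a binary tree of thresholded linear units.

    x       : input vector whose length is a power of two; each leaf unit
              receives two adjacent inputs, and each internal unit combines
              the outputs of its two children, up to the root (the soma).
    weights : one pair of synaptic weights per unit, listed level by level
              from the leaves to the root.
    biases  : one bias (threshold offset) per unit, in the same order.
    """
    layer = np.asarray(x, dtype=float)
    idx = 0
    while layer.size > 1:
        nxt = np.empty(layer.size // 2)
        for i in range(layer.size // 2):
            w, b = weights[idx], biases[idx]
            nxt[i] = relu(w[0] * layer[2 * i] + w[1] * layer[2 * i + 1] + b)
            idx += 1
        layer = nxt
    return layer[0]

# Example: an 8-input tree has 4 + 2 + 1 = 7 units.
rng = np.random.default_rng(0)
x = rng.normal(size=8)
weights = rng.normal(size=(7, 2))
biases = np.zeros(7)
out = binary_dendrite_tree(x, weights, biases)
```

The "repetition of synaptic inputs" manipulation studied in the paper would correspond to presenting the same input features at multiple leaves of such a tree (e.g. tiling `x` before the call), while the binary branching constraint is what limits each unit to exactly two children.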