Jyotibdha Acharya, Arindam Basu, Robert Legenstein, Thomas Limbacher, Panayiota Poirazi, Xundong Wu
Institute of Infocomm Research, A*STAR, Singapore.
Department of Electrical Engineering, City University of Hong Kong, Hong Kong.
Neuroscience. 2022 May 1;489:275-289. doi: 10.1016/j.neuroscience.2021.10.001. Epub 2021 Oct 14.
In this paper, we discuss the nonlinear computational power provided by dendrites in biological and artificial neurons. We start by briefly presenting biological evidence about the types of dendritic nonlinearities, their respective plasticity rules, and their effect on biological learning as assessed by computational models. Four major computational implications are identified: improved expressivity, more efficient use of resources, utilization of internal learning signals, and enabling of continual learning. We then discuss examples of how dendritic computations have been used to solve real-world classification problems, with performance reported on well-known data sets used in machine learning. The works are categorized according to the three primary methods of plasticity used: structural plasticity, weight plasticity, or plasticity of synaptic delays. Finally, we show the recent trend of confluence between concepts of deep learning and dendritic computations and highlight some future research directions.
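To make the idea of dendritic nonlinearity concrete, the following is a minimal sketch of a two-layer abstraction often used in computational models of this kind: each dendritic branch computes a weighted sum of its inputs passed through its own sigmoidal nonlinearity, and the soma linearly combines the branch outputs. The function name, weight layout, and choice of sigmoid are illustrative assumptions, not the specific model of any work reviewed here.

```python
import math

def sigmoid(x):
    """Logistic nonlinearity standing in for a dendritic branch's
    local (e.g., NMDA-spike-like) input-output function."""
    return 1.0 / (1.0 + math.exp(-x))

def dendritic_neuron(inputs, branch_weights, soma_weights):
    """Two-layer dendritic neuron sketch (hypothetical formulation):
    - each branch applies its own synaptic weights to the shared inputs,
      then a sigmoidal nonlinearity;
    - the soma sums the nonlinear branch outputs with somatic weights.
    """
    branch_outputs = []
    for weights in branch_weights:
        drive = sum(w * x for w, x in zip(weights, inputs))
        branch_outputs.append(sigmoid(drive))
    return sum(s * b for s, b in zip(soma_weights, branch_outputs))

# With zero synaptic drive, each branch sits at sigmoid(0) = 0.5,
# so two branches with unit somatic weights sum to 1.0.
out = dendritic_neuron([1.0, 1.0], [[0.0, 0.0], [0.0, 0.0]], [1.0, 1.0])
```

Because each branch has its own nonlinearity, such a unit can represent input interactions that a single linear summation followed by one output nonlinearity cannot, which is the expressivity gain the abstract refers to.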