

Neural networks counting chimes.

Author Information

Amit D J

Affiliation

Racah Institute of Physics, Hebrew University, Jerusalem.

Publication Information

Proc Natl Acad Sci U S A. 1988 Apr;85(7):2141-5. doi: 10.1073/pnas.85.7.2141.

Abstract

It is shown that the ideas that led to neural networks capable of recalling associatively and asynchronously temporal sequences of patterns can be extended to produce a neural network that automatically counts the cardinal number in a sequence of identical external stimuli. The network is explicitly constructed, analyzed, and simulated. Such a network may account for the cognitive effect of the automatic counting of chimes to tell the hour. A more general implication is that different electrophysiological responses to identical stimuli, at certain stages of cortical processing, do not necessarily imply synaptic modification, a la Hebb. Such differences may arise from the fact that consecutive identical inputs find the network in different stages of an active temporal sequence of cognitive states. These types of networks are then situated within a program for the study of cognition, which assigns the detection of meaning as the primary role of attractor neural networks rather than computation, in contrast to the parallel distributed processing attitude to the connectionist project. This interpretation is free of homunculus, as well as from the criticism raised against the cognitive model of symbol manipulation. Computation is then identified as the syntax of temporal sequences of quasi-attractors.
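
The record contains no code, but the core mechanism summarized above can be illustrated with a small toy model. The sketch below is an assumption-laden illustration, not Amit's actual construction: it combines a symmetric Hebbian term, which makes each "count" pattern an attractor, with an asymmetric hetero-associative term, gated here by each external stimulus, that pushes the network state from pattern k to pattern k+1, so that the currently occupied attractor encodes how many chimes have been heard. All names and values (N, P, LAMBDA, relax, chime, read_count) are illustrative choices, not quantities from the paper.

    # Toy illustration (not the paper's actual construction): a Hopfield-style
    # network whose state is kicked from attractor k to attractor k+1 by each
    # identical stimulus, so the occupied attractor encodes the chime count.
    import numpy as np

    rng = np.random.default_rng(0)

    N = 200       # number of +/-1 neurons (illustrative size)
    P = 8         # number of "count" patterns, attractors 0..P-1
    LAMBDA = 2.0  # strength of the stimulus-gated transition term (assumed)

    # Random count patterns xi[k] in {-1, +1}^N.
    xi = rng.choice([-1.0, 1.0], size=(P, N))

    # Symmetric Hebbian term: makes every stored pattern a stable attractor.
    J_sym = (xi.T @ xi) / N
    np.fill_diagonal(J_sym, 0.0)

    # Asymmetric hetero-associative term: maps pattern k toward pattern k+1.
    J_seq = (xi[1:].T @ xi[:-1]) / N

    def relax(s, sweeps=10):
        # Asynchronous relaxation under the symmetric (attractor) term only.
        for _ in range(sweeps):
            for i in rng.permutation(N):
                s[i] = 1.0 if J_sym[i] @ s >= 0 else -1.0
        return s

    def chime(s):
        # One identical external stimulus: briefly switch on the sequence
        # term, which kicks the state toward the next pattern, then relax.
        s = np.sign(J_sym @ s + LAMBDA * (J_seq @ s))
        s[s == 0] = 1.0
        return relax(s)

    def read_count(s):
        # Decode the count as the stored pattern with the largest overlap.
        return int(np.argmax(xi @ s / N))

    state = relax(xi[0].copy())          # start in the "zero chimes" attractor
    for n in range(1, 6):
        state = chime(state)
        print(n, "chimes -> network reads", read_count(state))

In the temporal-sequence models cited below, the transitions between quasi-attractors are carried by slow or delayed asymmetric synapses rather than an explicit gating constant; the sketch collapses that dynamics into a single stimulus-triggered update for brevity.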


Similar Articles

1. Neural networks counting chimes. Proc Natl Acad Sci U S A. 1988 Apr;85(7):2141-5. doi: 10.1073/pnas.85.7.2141.

7. Dynamics and computation of continuous attractors. Neural Comput. 2008 Apr;20(4):994-1025. doi: 10.1162/neco.2008.10-06-378.

9. Synaptic dynamics: linear model and adaptation algorithm. Neural Netw. 2014 Aug;56:49-68. doi: 10.1016/j.neunet.2014.04.001. Epub 2014 Apr 28.

Cited By

3. Robust computation with rhythmic spike patterns. Proc Natl Acad Sci U S A. 2019 Sep 3;116(36):18050-18059. doi: 10.1073/pnas.1902653116. Epub 2019 Aug 20.

References

1. Temporal association in asymmetric neural networks. Phys Rev Lett. 1986 Dec 1;57(22):2861-2864. doi: 10.1103/PhysRevLett.57.2861.

2. Information storage in neural networks with low levels of activity. Phys Rev A Gen Phys. 1987 Mar 1;35(5):2293-2303. doi: 10.1103/physreva.35.2293.

3. Spin-glass models of neural networks. Phys Rev A Gen Phys. 1985 Aug;32(2):1007-1018. doi: 10.1103/physreva.32.1007.

5. A cognitive and associative memory. Biol Cybern. 1987;57(3):197-206. doi: 10.1007/BF00364151.

6. Neural networks that learn temporal sequences by selection. Proc Natl Acad Sci U S A. 1987 May;84(9):2727-31. doi: 10.1073/pnas.84.9.2727.

7. Sequential state generation by model neural networks. Proc Natl Acad Sci U S A. 1986 Dec;83(24):9469-73. doi: 10.1073/pnas.83.24.9469.
