Brown G D, Hulme C, Dalloz P
Department of Psychology, University of Warwick, Coventry, UK.
Br J Math Stat Psychol. 1996 May;49(Pt 1):1-24. doi: 10.1111/j.2044-8317.1996.tb01072.x.
The mathematical operation of convolution is used as an associative mechanism by several recent influential models of human memory. Convolution can associate two vectors (representing items to be remembered) into a memory trace vector in a single operation. An approximation to either of the input vectors can then be retrieved by using the other vector as a probe. Recent convolution-based memory models have accounted for a wide range of data. Connectionist models may have greater potential for providing developmental accounts, but the architectures that have been most widely used to account for developmental phenomena cannot perform one-trial learning, and this has limited their use as models of human memory. We show that a connectionist-like architecture can learn, using a gradient-descent algorithm, to perform one-trial learning in a manner similar to convolution. The solution that the network finds leads to less variable retrieval than does convolution. Furthermore, the network can learn to carry out the convolution operation itself. This provides a link between connectionist and convolution approaches, and a basis for models with many of the attractions of both.
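The bind-and-probe scheme described in the abstract can be illustrated with a short sketch. This is not the paper's own implementation; it uses circular convolution for binding and circular correlation for probing, the form commonly used in convolution-based memory models (e.g. holographic reduced representations), with random item vectors whose dimensionality `n` is an arbitrary choice here:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1024  # dimensionality of item vectors (illustrative choice)

# Item vectors with elements drawn from N(0, 1/n), so each has expected unit length
a = rng.normal(0.0, 1.0 / np.sqrt(n), n)
b = rng.normal(0.0, 1.0 / np.sqrt(n), n)

# Associate the two items in one operation: circular convolution t = a (*) b,
# computed efficiently via the FFT (convolution theorem)
t = np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

# Probe the trace with a: circular correlation retrieves a noisy
# approximation of the other input vector, b
b_hat = np.real(np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(t)))

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

# The retrieved vector resembles b (similarity well above chance)
# and is essentially unrelated to the probe a
print(cosine(b_hat, b), cosine(b_hat, a))
```

The retrieved vector `b_hat` is only an approximation, which is why such models typically pair retrieval with a clean-up step that maps the noisy output to the nearest stored item; the variability of this approximation is the quantity the abstract says the learned network reduces.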