Centre for Theoretical Neuroscience, University of Waterloo, Waterloo, ON N2L 3G1 Canada
Neural Comput. 2019 May;31(5):849-869. doi: 10.1162/neco_a_01179. Epub 2019 Mar 18.
We present a new binding operation, vector-derived transformation binding (VTB), for use in vector symbolic architectures (VSAs). We compare the performance of VTB to that of circular convolution, the binding operation used in holographic reduced representations (HRRs), in terms of list and stack encoding capacity. Special attention is given to the possibility of a neural implementation by means of the Neural Engineering Framework (NEF). While the required neural resources scale slightly worse for VTB, its encoding capacity is on par with circular convolution for lists and better for stacks. Furthermore, VTB changes the length of the resulting vectors less, which also benefits a neural implementation. Consequently, we argue that VTB is an improvement over HRRs for neurally implemented VSAs.
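As a rough illustration of the two binding operations compared above, the following NumPy sketch implements HRR binding (circular convolution with approximate unbinding by circular correlation) and a VTB-style binding, in which a block-diagonal transform is derived from the second vector by reshaping and rescaling it. The block-diagonal construction and unbinding by transpose follow the paper as we understand it, but the function names, the dimensionality, and the demo at the bottom are illustrative assumptions, not the authors' reference implementation.

import numpy as np

def unit_vector(d, rng):
    # Random unit-length vector, a common choice for VSA symbols.
    v = rng.standard_normal(d)
    return v / np.linalg.norm(v)

def hrr_bind(x, y):
    # HRR binding: circular convolution, computed via FFT.
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(y)))

def hrr_unbind(z, y):
    # Approximate HRR unbinding: convolve with the involution of y.
    y_inv = np.concatenate(([y[0]], y[:0:-1]))
    return hrr_bind(z, y_inv)

def vtb_bind(x, y):
    # VTB-style binding: apply a transform derived from y to each
    # sqrt(d)-sized block of x. Requires d to be a square number
    # (d = d'**2). The d'-by-d' matrix is y reshaped row-wise and
    # scaled by sqrt(d'), which makes the block-diagonal transform
    # approximately orthogonal when y has unit length.
    d = x.shape[0]
    dp = int(round(np.sqrt(d)))
    assert dp * dp == d, "VTB requires a square dimensionality"
    Vy = np.sqrt(dp) * y.reshape(dp, dp)
    return (x.reshape(dp, dp) @ Vy.T).ravel()

def vtb_unbind(z, y):
    # Approximate VTB unbinding: apply the transpose of the transform.
    d = z.shape[0]
    dp = int(round(np.sqrt(d)))
    Vy = np.sqrt(dp) * y.reshape(dp, dp)
    return (z.reshape(dp, dp) @ Vy).ravel()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = 64  # illustrative dimensionality; must be square for VTB
    x, y = unit_vector(d, rng), unit_vector(d, rng)
    for name, bind, unbind in [("HRR", hrr_bind, hrr_unbind),
                               ("VTB", vtb_bind, vtb_unbind)]:
        z = bind(x, y)
        x_hat = unbind(z, y)
        sim = np.dot(x, x_hat) / (np.linalg.norm(x) * np.linalg.norm(x_hat))
        print(f"{name}: cosine(x, unbind(bind(x, y), y)) = {sim:.3f}")

In both cases unbinding is only approximate for random unit vectors, so the recovered vector is identified by comparing it against a clean-up memory of known symbols; the printed cosine similarity gives a rough sense of how well each operation preserves the bound item at this dimensionality.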