Center for Memory and Brain, Boston University, Boston, MA 02215, USA.
Neural Comput. 2012 Jan;24(1):134-93. doi: 10.1162/NECO_a_00212. Epub 2011 Sep 15.
We propose a principled way to construct an internal representation of the temporal stimulus history leading up to the present moment. A set of leaky integrators performs a Laplace transform on the stimulus function, and a linear operator approximates the inversion of the Laplace transform. The result is a representation of stimulus history that retains information about the temporal sequence of stimuli. This procedure naturally represents more recent stimuli more accurately than less recent stimuli; the decrement in accuracy is precisely scale invariant. This procedure also yields time cells that fire at specific latencies following the stimulus with a scale-invariant temporal spread. Combined with a simple associative memory, this representation gives rise to a moment-to-moment prediction that is also scale invariant in time. We propose that this scale-invariant representation of temporal stimulus history could serve as an underlying representation accessible to higher-level behavioral and cognitive mechanisms. In order to illustrate the potential utility of this scale-invariant representation in a variety of fields, we sketch applications using minimal performance functions to problems in classical conditioning, interval timing, scale-invariant learning in autoshaping, and the persistence of the recency effect in episodic memory across timescales.
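The mechanism described above can be made concrete with a small numerical sketch. The Python snippet below is not the authors' implementation; it assumes illustrative parameter choices (the grid of decay rates s, the derivative order k, the time step dt, and the pulse stimulus) and uses a crude numerical s-derivative in place of the discrete linear operator in the paper. It shows a bank of leaky integrators obeying dF/dt = -s*F + f(t), which encodes the Laplace transform of the stimulus history, followed by a Post-style approximate inversion, f~(tau*) ≈ ((-1)^k / k!) s^(k+1) d^kF/ds^k evaluated at internal past times tau* = -k/s.

    import numpy as np
    from math import factorial

    # Illustrative sketch (not the authors' code). The rate grid, k, dt,
    # and the stimulus below are assumptions chosen for clarity.

    k = 4                                  # order of the approximate inverse transform
    s = np.geomspace(0.05, 20.0, 400)      # decay rates of the leaky integrators
    dt = 0.01                              # simulation time step
    F = np.zeros_like(s)                   # Laplace-domain state F(s, t)

    def step(F, f_t):
        """One Euler step of the leaky integrators: dF/dt = -s*F + f(t)."""
        return F + dt * (-s * F + f_t)

    def invert(F):
        """Post-style approximate inverse Laplace transform:
        f~(tau*) ~ ((-1)^k / k!) * s^(k+1) * d^k F / d s^k, with tau* = -k/s.
        The derivative in s is taken numerically along the rate grid, a crude
        stand-in for the discrete linear operator used in the model."""
        dF = F.copy()
        for _ in range(k):
            dF = np.gradient(dF, s)
        return ((-1.0) ** k / factorial(k)) * s ** (k + 1) * dF

    # Present a brief pulse near t = 0, then let roughly 5 time units elapse.
    for t in np.arange(0.0, 5.0, dt):
        f_t = 1.0 if t < 0.1 else 0.0
        F = step(F, f_t)

    tau_star = -k / s                      # internal "past time" axis (negative = past)
    history = invert(F)                    # blurred estimate of the stimulus history
    # The estimate is a single bump located about k/(k+1) of the way back to the
    # pulse, with a spread proportional to the elapsed time: doubling the delay
    # doubles the blur, the scale invariance described in the abstract.
    print(tau_star[np.argmax(history)])

Increasing k sharpens the reconstructed bump toward the true delay, but at any fixed k the relative blur is the same at every delay, which is the scale-invariant decrement in accuracy the abstract refers to.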