Universidad Nacional Autónoma de México, IIMAS, 04510, Mexico City, Mexico.
Universidad de Guadalajara, SUV, 44130, Guadalajara, Mexico.
Sci Rep. 2022 Oct 6;12(1):16703. doi: 10.1038/s41598-022-20798-0.
The Entropic Associative Memory (EAM) holds declarative but distributed representations of remembered objects. These are characterized as functions from features to discrete values in an abstract amodal space. Memory objects are registered or remembered through a declarative operation; memory recognition is defined as a logical test, and cues of objects not contained in the memory are rejected directly without search; and memory retrieval is a constructive operation. In its original formulation, the content of basic memory units or cells was either on or off, hence all stored objects had the same weight or strength. In the present weighted version (W-EAM) we introduce a basic learning mechanism whereby the values of the cells used in the representation of an object are reinforced by the memory register operation. As memory cells are shared by different representations, the corresponding associations are reinforced too. The memory system supports a second form of learning: the distributed representation generalizes and renders a large set of potential or latent units that can be used to recognize novel inputs, which can in turn be used to improve the performance of both the deep neural networks used for modelling perception and action, and of the memory operations. This process can be performed recurrently in an open-ended fashion and can be used in long-term learning. An experiment in the phonetic domain using the Mexican Spanish DIMEx100 Corpus was carried out. This corpus was collected in a controlled noise-free environment and was transcribed manually by trained human phoneticians, but consists of a relatively small number of utterances. DIMEx100 was used to produce the initial state of the perceptual and motor modules, and for testing the performance of the memory system at that state.
Then the incremental learning cycle was modelled using the Spanish CIEMPIESS Corpus, which consists of a very large number of noisy, untagged speech utterances collected from radio and TV. The results support the viability of the Weighted Entropic Associative Memory for modelling cognitive processes, such as phonetic representation and learning; for the construction of applications, such as speech recognition and synthesis; and as a computational model of natural memory.
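The three memory operations described above (register, recognition, retrieval) can be sketched in a minimal toy form. This is an illustrative assumption-laden sketch, not the authors' implementation: it assumes the memory is a grid of non-negative integer weights with one column per feature, an object is a choice of one discrete value per feature, registration increments the weights of the cells used by the representation, recognition is a logical test that all of the cue's cells are on, and retrieval samples a value per feature in proportion to the stored weights.

```python
import numpy as np

class WEAM:
    """Toy sketch of a weighted associative memory register (hypothetical
    class, for illustration only). The memory is an n_values x n_features
    grid of weights; an object maps each feature to one value."""

    def __init__(self, n_features, n_values):
        self.w = np.zeros((n_values, n_features), dtype=int)

    def register(self, cue):
        # Declarative registration: reinforce the cells used by the
        # representation (the weighted learning mechanism).
        for f, v in enumerate(cue):
            self.w[v, f] += 1

    def recognize(self, cue):
        # Logical test: accept iff every cell of the cue is on; cues of
        # objects not in the memory are rejected directly, without search.
        return all(self.w[v, f] > 0 for f, v in enumerate(cue))

    def retrieve(self, cue, rng=None):
        # Constructive retrieval: if the cue is accepted, build an object
        # by sampling each feature's value in proportion to its weights.
        if not self.recognize(cue):
            return None
        rng = rng or np.random.default_rng(0)
        out = []
        for f in range(self.w.shape[1]):
            col = self.w[:, f].astype(float)
            out.append(int(rng.choice(len(col), p=col / col.sum())))
        return out

# Example: one registered object is recognized and reconstructed;
# an unregistered cue is rejected without search.
m = WEAM(n_features=4, n_values=8)
m.register([1, 2, 3, 4])
print(m.recognize([1, 2, 3, 4]))   # True
print(m.recognize([0, 0, 0, 0]))   # False
print(m.retrieve([1, 2, 3, 4]))    # [1, 2, 3, 4]
```

Because cells are shared across representations, registering a second object that overlaps the first reinforces their common cells, which is the source of the latent units the abstract describes.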