Harald Atmanspacher, Thomas Filk
Institute for Frontier Areas of Psychology and Mental Health, Wilhelmstr. 3a, 79098 Freiburg, Germany.
Biosystems. 2006 Jul;85(1):84-93. doi: 10.1016/j.biosystems.2006.03.001. Epub 2006 May 9.
We present results from numerical studies of supervised learning operations in small recurrent networks considered as graphs, leading from a given set of input conditions to predetermined outputs. Graphs that have optimized their output for particular inputs with respect to predetermined outputs are asymptotically stable and can be characterized by attractors, which form a representation space for an associative multiplicative structure of input operations. As the mapping from a series of inputs onto a series of such attractors generally depends on the sequence of inputs, this structure is generally non-commutative. Moreover, the size of the set of attractors, indicating the complexity of learning, is found to behave non-monotonically as learning proceeds. A tentative relation between this complexity and the notion of pragmatic information is indicated.
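To make the non-commutativity claim concrete, here is a minimal sketch (not the authors' code) of a small recurrent network treated as a graph: an input pattern drives the network to an attractor, and applying two inputs in opposite orders can end in different attractors. The network size, random weights, and synchronous threshold dynamics are illustrative assumptions, not details taken from the paper.

```python
"""Illustrative sketch: sequence-dependence of input-to-attractor mappings
in a small recurrent network viewed as a weighted directed graph."""
import numpy as np

rng = np.random.default_rng(0)

N = 8                                    # assumed small network size
W = rng.choice([-1.0, 0.0, 1.0], size=(N, N), p=[0.3, 0.4, 0.3])
np.fill_diagonal(W, 0.0)                 # no self-loops in the graph

def run_to_attractor(state, inp, max_steps=100):
    """Iterate synchronous threshold dynamics under a constant external input
    until a previously visited state recurs; return the final state."""
    state = state.copy()
    seen = set()
    for _ in range(max_steps):
        seen.add(tuple(state))
        state = np.sign(W @ state + inp)
        state[state == 0] = 1.0
        if tuple(state) in seen:         # fixed point (or revisited state)
            break
    return state

# Two input patterns acting as operations on the network state.
inp_A = np.where(rng.random(N) < 0.5, 1.0, -1.0)
inp_B = np.where(rng.random(N) < 0.5, 1.0, -1.0)

s0 = np.ones(N)                          # common initial state

# Apply the inputs in the two possible orders and compare the attractors.
s_AB = run_to_attractor(run_to_attractor(s0, inp_A), inp_B)
s_BA = run_to_attractor(run_to_attractor(s0, inp_B), inp_A)

print("attractor after A then B:", s_AB)
print("attractor after B then A:", s_BA)
print("order-dependent (non-commutative)?", not np.array_equal(s_AB, s_BA))
```

With some weight matrices the two orderings coincide and with others they differ; the paper's point is that, in general, the composed mapping depends on the input sequence, which is what the associative but non-commutative structure of input operations expresses.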