Marks II R J
Appl Opt. 1987 May 15;26(10):2005-10. doi: 10.1364/AO.26.002005.
A neural net capable of restoring continuous-level library vectors from memory is considered. As with Hopfield's neural net content-addressable memory, the vectors in the memory library are used to program the neural interconnects. Given a portion of one of the library vectors, the net extrapolates the remainder. Sufficient conditions for convergence are stated. Effects of processor inexactitude and net faults are discussed. A more efficient computational technique for performing the memory extrapolation (at the cost of fault tolerance) is derived. The special case of table lookup memories is addressed specifically.
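A minimal sketch of the idea described in the abstract, under stated assumptions: the interconnect is taken to be the orthogonal projection onto the span of the library vectors, and the partial input is extrapolated by a clamp-and-project iteration. The library data, the projection construction, and all variable names below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

# Library of continuous-level vectors (columns of F). Values here are
# arbitrary illustrative data standing in for an application's library.
rng = np.random.default_rng(0)
N, M = 8, 3                        # vector length, number of library vectors
F = rng.standard_normal((N, M))    # library matrix, columns assumed independent

# Interconnect "programmed" from the library: orthogonal projection onto
# the span of the library vectors (an assumed, standard construction).
T = F @ np.linalg.pinv(F)          # equals F (F^T F)^{-1} F^T for full column rank

# A portion (the first p elements) of one library vector is given as input.
p = 5
target = F[:, 1]
x = np.zeros(N)
x[:p] = target[:p]                 # known elements; the rest must be extrapolated

# Clamp-and-project iteration: project onto the library subspace, then
# restore the known elements, until the estimate stops changing.
for _ in range(500):
    x_new = T @ x
    x_new[:p] = target[:p]
    if np.linalg.norm(x_new - x) < 1e-12:
        break
    x = x_new

print("recovered unknown part:", np.round(x[p:], 6))
print("true unknown part     :", np.round(target[p:], 6))
```

In this sketch, convergence to the stored vector requires that the known portion identify the library vector uniquely (here, that the first p rows of F have full column rank), which is in the spirit of the sufficient conditions the abstract refers to; the abstract's "more efficient computational technique" would correspond to solving for the unknown elements in closed form rather than iterating.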