Karnin, E. D.
IBM Sci. and Technol., Technion City, Haifa.
A simple procedure for pruning back-propagation trained neural networks. IEEE Trans. Neural Netw. 1990;1(2):239-242. doi: 10.1109/72.80236.
The sensitivity of the global error (cost) function to the inclusion or exclusion of each synapse in the artificial neural network is estimated. Shadow arrays are introduced that keep track of the incremental changes to the synaptic weights during a single pass of back-propagation learning. The synapses are then ordered by decreasing sensitivity, so that the network can be pruned efficiently by discarding the last items of the sorted list. Unlike previous approaches, this simple procedure neither requires a modification of the cost function nor interferes with the learning process, and it demands only negligible computational overhead.
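The abstract does not spell out the sensitivity formula, but the idea can be sketched under stated assumptions: with plain gradient descent (step size η), a shadow array accumulates the squared weight updates Δw², and the sensitivity of each synapse is estimated as S ≈ |w_final / (w_final − w_init)| · Σ Δw² / η, so that synapses can be ranked and the least sensitive ones discarded. The toy task, variable names, and the exact reconstruction of the formula below are illustrative assumptions, not the paper's verbatim procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task (illustrative): the target depends on only two of six inputs,
# so four synapses of a single linear unit should be prunable.
X = rng.normal(size=(200, 6))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1]

eta = 0.01                          # learning rate
w = rng.normal(scale=0.1, size=6)   # synaptic weights
w_init = w.copy()
shadow = np.zeros_like(w)           # shadow array: one slot per synapse

for _ in range(300):
    grad = (X @ w - y) @ X / len(X)  # gradient of the quadratic cost
    dw = -eta * grad                 # incremental weight change
    shadow += dw ** 2                # accumulated during learning itself
    w += dw

# Assumed sensitivity estimate (see lead-in): large when a synapse both
# moved substantially during training and ended far from zero.
S = np.abs(shadow / eta * w / (w - w_init))

order = np.argsort(S)[::-1]        # sort by decreasing sensitivity
keep = sorted(order[:2].tolist())  # discard the tail of the sorted list
print(keep)                        # the two informative synapses survive
```

Note how the overhead matches the abstract's claim: the shadow array adds only one accumulation per weight per update and never touches the cost function or the learning rule itself.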