Center for Systems Neuroscience, Boston University, 2 Cummington Mall, Boston, MA 02215, USA.
Trends Cogn Sci. 2017 Jun;21(6):407-408. doi: 10.1016/j.tics.2017.04.001. Epub 2017 Apr 23.
Humans regularly learn new information without losing memory for previously learned information, but neural network models suffer from catastrophic forgetting, in which new learning impairs previously acquired function. A recent article presents an algorithm that protects synapses important for previously learned functions from further change, reducing catastrophic forgetting.
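The core idea of protecting important synapses can be sketched as a quadratic penalty that pulls each weight back toward its previously learned value, scaled by a per-weight importance estimate (in the spirit of elastic weight consolidation). The function names, the importance values, and the regularization strength `lam` below are illustrative assumptions, not the article's actual implementation.

```python
# Hedged sketch: a quadratic penalty that slows learning at weights deemed
# important for a previously learned task. All names and numbers here are
# illustrative; the referenced article's algorithm is not reproduced verbatim.

def consolidation_penalty(weights, old_weights, importance, lam=1.0):
    """Penalty that grows as important weights drift from their old values."""
    return 0.5 * lam * sum(
        f * (w - w_old) ** 2
        for w, w_old, f in zip(weights, old_weights, importance)
    )

def penalized_gradient(task_grad, weights, old_weights, importance, lam=1.0):
    """Task gradient plus a pull back toward old values on important weights."""
    return [
        g + lam * f * (w - w_old)
        for g, w, w_old, f in zip(task_grad, weights, old_weights, importance)
    ]

# A highly important weight (importance 10.0) is pulled back strongly toward
# its old value; an unimportant weight (importance 0.0) remains free to
# change for the new task.
old_w = [1.0, 1.0]
w = [2.0, 2.0]
imp = [10.0, 0.0]
print(consolidation_penalty(w, old_w, imp))           # 5.0
print(penalized_gradient([0.0, 0.0], w, old_w, imp))  # [10.0, 0.0]
```

In practice the importance values would be estimated from data (e.g., from the sensitivity of the old task's loss to each weight), so that plasticity is reduced only where change would damage previously learned function.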