Zenke Friedemann, Poole Ben, Ganguli Surya
Stanford University.
Proc Mach Learn Res. 2017;70:3987-3995.
While deep learning has led to remarkable advances across diverse applications, it struggles in domains where the data distribution changes over the course of learning. In stark contrast, biological neural networks continually adapt to changing domains, possibly by leveraging complex molecular machinery to solve many tasks simultaneously. In this study, we introduce intelligent synapses that bring some of this biological complexity into artificial neural networks. Each synapse accumulates task-relevant information over time, and exploits this information to rapidly store new memories without forgetting old ones. We evaluate our approach on continual learning of classification tasks, and show that it dramatically reduces forgetting while maintaining computational efficiency.
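The abstract only gestures at the mechanism, so the following is a minimal sketch of the synaptic-intelligence idea the paper develops: each parameter accumulates a path-integral estimate of its importance while a task is trained, and a quadratic surrogate loss anchors important parameters on later tasks. The class name, the PyTorch framing, and the default values of the strength c and damping xi are illustrative assumptions, not the authors' reference implementation.

```python
import torch

class SynapticIntelligence:
    """Sketch of per-synapse importance tracking (synaptic intelligence).

    Hypothetical helper, not the authors' code: w_k accumulates
    -g_k * delta_theta_k during training; at a task boundary these path
    integrals are converted into importances Omega_k, which weight a
    quadratic penalty c * sum_k Omega_k (theta_k - theta_k*)^2 on later tasks.
    """

    def __init__(self, params, c=0.1, xi=1e-3):
        self.params = list(params)
        self.c = c    # regularization strength (assumed value)
        self.xi = xi  # damping term to avoid division by zero (assumed value)
        self.w = [torch.zeros_like(p) for p in self.params]      # running path integrals
        self.omega = [torch.zeros_like(p) for p in self.params]  # consolidated importances
        self.anchor = [p.detach().clone() for p in self.params]  # reference weights theta*

    def accumulate(self, prev_params):
        # Call after each optimizer step, with prev_params snapshotted just
        # before the step: w_k += -g_k * (theta_k - theta_k_prev), an online
        # estimate of each parameter's contribution to the drop in loss.
        for w, p, prev in zip(self.w, self.params, prev_params):
            if p.grad is not None:
                w.add_(-p.grad.detach() * (p.detach() - prev))

    def consolidate(self):
        # At a task boundary, convert path integrals into importances,
        # re-anchor the reference weights, and reset the integrals.
        for w, om, p, a in zip(self.w, self.omega, self.params, self.anchor):
            om.add_(w / ((p.detach() - a) ** 2 + self.xi))
            a.copy_(p.detach())
            w.zero_()

    def penalty(self):
        # Quadratic surrogate loss added to the task loss on every task
        # after the first, pulling important parameters toward theta*.
        return self.c * sum(
            (om * (p - a) ** 2).sum()
            for om, p, a in zip(self.omega, self.params, self.anchor)
        )
```

In a training loop under these assumptions, one would snapshot the parameters before each optimizer step, call accumulate afterwards, add penalty() to the task loss on every task after the first, and call consolidate() at each task boundary.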