Yang Shen, Sanjoy Dasgupta, Saket Navlakha
Cold Spring Harbor Laboratory, Simons Center for Quantitative Biology, Cold Spring Harbor, NY 11724, U.S.A.
Department of Computer Science and Engineering, University of California, San Diego, La Jolla, CA 92093, U.S.A.
Neural Comput. 2023 Oct 10;35(11):1797-1819. doi: 10.1162/neco_a_01615.
Catastrophic forgetting remains an outstanding challenge in continual learning. Recently, methods inspired by the brain, such as continual representation learning and memory replay, have been used to combat catastrophic forgetting. Associative learning (retaining associations between inputs and outputs, even after good representations are learned) serves an important function in the brain; however, its role in continual learning has not been carefully studied. Here, we identified a two-layer neural circuit in the fruit fly olfactory system that performs continual associative learning between odors and their associated valences. In the first layer, inputs (odors) are encoded using sparse, high-dimensional representations, which reduces memory interference by activating nonoverlapping populations of neurons for different odors. In the second layer, only the synapses between odor-activated neurons and the odor's associated output neuron are modified during learning; the rest of the weights are frozen to prevent unrelated memories from being overwritten. We prove theoretically that these two perceptron-like layers help reduce catastrophic forgetting under continual learning, compared with the original perceptron algorithm. We then show empirically on benchmark data sets that this simple and lightweight architecture outperforms other popular neural-inspired algorithms when those algorithms are also restricted to a two-layer feedforward architecture. Overall, fruit flies evolved an efficient continual associative learning algorithm, and circuit mechanisms from neuroscience can be translated to improve machine computation.
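To make the mechanism concrete, here is a minimal Python sketch of the two-layer scheme described above: a fixed sparse random expansion with a k-winners-take-all step in the first layer, and associative updates restricted to the synapses between the currently active expansion units and the target output unit in the second layer. The class name `FlyLikeLearner`, the Hebbian-style weight increment, and the parameter choices (`m`, `k`, `lr`) are illustrative assumptions for exposition, not the authors' exact model.

```python
import numpy as np

rng = np.random.default_rng(0)

class FlyLikeLearner:
    """Illustrative sketch of the two-layer continual associative learner
    described in the abstract (assumed details, not the published model)."""

    def __init__(self, d_in, n_classes, m=2000, k=50, lr=0.1):
        # Layer 1: fixed sparse random projection to a high-dimensional space.
        # Sparse, largely nonoverlapping codes for different inputs reduce
        # memory interference.
        self.proj = (rng.random((m, d_in)) < 0.1).astype(float)
        self.k = k          # number of winners kept active per input
        self.lr = lr        # assumed Hebbian-style increment
        # Layer 2: associative weights from expansion units to output
        # (valence/class) units.
        self.W = np.zeros((n_classes, m))

    def encode(self, x):
        # Project, then keep only the k most strongly driven units (k-WTA).
        h = self.proj @ x
        active = np.argsort(h)[-self.k:]
        code = np.zeros_like(h)
        code[active] = 1.0
        return code, active

    def learn(self, x, y):
        # Update only the synapses between the active units and the associated
        # output neuron y; all other weights stay frozen, so earlier
        # associations are not overwritten.
        _, active = self.encode(x)
        self.W[y, active] += self.lr

    def predict(self, x):
        code, _ = self.encode(x)
        return int(np.argmax(self.W @ code))
```

Because `learn` touches only the row of `W` belonging to the current label, and only the columns of the currently active expansion units, training on new inputs leaves previously stored associations largely intact, which is the partial-freezing property the abstract credits with reducing catastrophic forgetting.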