

Temporal Contrastive Learning through implicit non-equilibrium memory.

Authors

Falk Martin J, Strupp Adam T, Scellier Benjamin, Murugan Arvind

Publication

arXiv. 2025 Jan 29; arXiv:2312.17723v2.

Abstract

The backpropagation method has enabled transformative uses of neural networks. Alternatively, for energy-based models, local learning methods involving only nearby neurons offer benefits in terms of decentralized training, and allow for the possibility of learning in computationally constrained substrates. One class of local learning methods contrasts the desired, clamped behavior with spontaneous, free behavior. However, directly contrasting free and clamped behaviors requires explicit memory. Here, we introduce 'Temporal Contrastive Learning', an approach that uses integral feedback in each learning degree of freedom to provide a simple form of implicit non-equilibrium memory. During training, free and clamped behaviors are shown in a sawtooth-like protocol over time. When combined with integral feedback dynamics, these alternating temporal protocols generate an implicit memory necessary for comparing free and clamped behaviors, broadening the range of physical and biological systems capable of contrastive learning. Finally, we show that non-equilibrium dissipation improves learning quality and determine a Landauer-like energy cost of contrastive learning through physical dynamics.
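A minimal toy sketch of the mechanism the abstract describes, under simplifying assumptions (these are not the paper's exact equations): a single learning parameter `w` is trained by a sawtooth-like alternation of free and clamped phases, while a slow integral-feedback variable `m` acts as the implicit memory, so the local update never needs explicit storage of the free-phase response. The rates `eta`, `tau`, the constant input `x`, and the target `w_target` are illustrative choices.

```python
# Toy sketch (assumed, simplified dynamics): contrastive learning via an
# integral-feedback memory instead of an explicitly stored free state.
w_target = 2.0   # behavior imposed during clamped phases
w = 0.0          # the learning degree of freedom
m = 0.0          # slow integral-feedback variable: the implicit memory
eta, tau = 0.05, 0.5   # learning rate and memory rate (illustrative)
x = 1.0          # constant input, for simplicity

for step in range(2000):
    clamped = (step % 2 == 1)           # sawtooth-like free/clamped alternation
    y = (w_target if clamped else w) * x
    sign = 1.0 if clamped else -1.0     # push toward clamped, away from free
    # Local update: compare the current activity to the memory m; over one
    # free/clamped pair this approximates the usual clamped-minus-free rule.
    w += eta * sign * (y - m) * x
    m += tau * (y - m)                  # integral feedback tracks recent activity

print(w)  # w should approach w_target
```

Over a free/clamped pair with slowly varying `m`, the two updates sum to roughly `eta * (y_clamped - y_free) * x`, recovering the contrastive rule from purely temporal, local information.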


Similar Articles

2. Temporal Contrastive Learning through implicit non-equilibrium memory. Nat Commun. 2025 Mar 4;16(1):2163. doi: 10.1038/s41467-025-57043-x.
3. Contrastive Hebbian learning with random feedback weights. Neural Netw. 2019 Jun;114:1-14. doi: 10.1016/j.neunet.2019.01.008. Epub 2019 Feb 21.
4. Equivalence of backpropagation and contrastive Hebbian learning in a layered network. Neural Comput. 2003 Feb;15(2):441-54. doi: 10.1162/089976603762552988.
5. Equilibrium Propagation: Bridging the Gap between Energy-Based Models and Backpropagation. Front Comput Neurosci. 2017 May 4;11:24. doi: 10.3389/fncom.2017.00024. eCollection 2017.
6. Transferable Implicit Solvation via Contrastive Learning of Graph Neural Networks. ACS Cent Sci. 2023 Nov 16;9(12):2286-2297. doi: 10.1021/acscentsci.3c01160. eCollection 2023 Dec 27.
7. Explicit and Implicit Feature Contrastive Learning Model for Knowledge Graph Link Prediction. Sensors (Basel). 2024 Nov 18;24(22):7353. doi: 10.3390/s24227353.
8. Biologically Plausible Training Mechanisms for Self-Supervised Learning in Deep Networks. Front Comput Neurosci. 2022 Mar 21;16:789253. doi: 10.3389/fncom.2022.789253. eCollection 2022.
9. Decentralized learning for medical image classification with prototypical contrastive network. Med Phys. 2025 Jun;52(6):4188-4204. doi: 10.1002/mp.17753. Epub 2025 Mar 16.
10. STACoRe: Spatio-temporal and action-based contrastive representations for reinforcement learning in Atari. Neural Netw. 2023 Mar;160:1-11. doi: 10.1016/j.neunet.2022.12.018. Epub 2022 Dec 29.
