
Suppr 超能文献




A self-learning magnetic Hopfield neural network with intrinsic gradient descent adaption.

Authors

Niu Chang, Zhang Huanyu, Xu Chuanlong, Hu Wenjie, Wu Yunzhuo, Wu Yu, Wang Yadi, Wu Tong, Zhu Yi, Zhu Yinyan, Wang Wenbin, Wu Yizheng, Yin Lifeng, Xiao Jiang, Yu Weichao, Guo Hangwen, Shen Jian

Affiliations

State Key Laboratory of Surface Physics and Institute for Nanoelectronic Devices and Quantum Computing, Fudan University, Shanghai 200433, China.

Department of Physics, Fudan University, Shanghai 200433, China.

Publication

Proc Natl Acad Sci U S A. 2024 Dec 17;121(51):e2416294121. doi: 10.1073/pnas.2416294121. Epub 2024 Dec 13.

DOI: 10.1073/pnas.2416294121
PMID: 39671188
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11665918/
Abstract

Physical neural networks (PNN) using physical materials and devices to mimic synapses and neurons offer an energy-efficient way to implement artificial neural networks. Yet, training PNN is difficult and heavily relies on external computing resources. An emerging concept to solve this issue is called physical self-learning that uses intrinsic physical parameters as trainable weights. Under external inputs (i.e., training data), training is achieved by the natural evolution of physical parameters that intrinsically adapt modern learning rules via an autonomous physical process, eliminating the requirements on external computation resources. Here, we demonstrate a real spintronic system that mimics Hopfield neural networks (HNN), and unsupervised learning is intrinsically performed via the evolution of the physical process. Using magnetic texture-defined conductance matrix as trainable weights, we illustrate that under external voltage inputs, the conductance matrix naturally evolves and adapts Oja's learning algorithm in a gradient descent manner. The self-learning HNN is scalable and can achieve associative memories on patterns with high similarities. The fast spin dynamics and reconfigurability of magnetic textures offer an advantageous platform toward efficient autonomous training directly in materials.
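To make the network model in the abstract concrete, below is a minimal software sketch of a Hopfield associative memory plus a single-neuron Oja update. This is only the classical mathematics the device is said to realize physically: the paper's contribution is that a magnetic-texture conductance matrix evolves into these weights on its own under voltage inputs, whereas here the updates are computed explicitly. All function names are illustrative, not from the paper.

```python
# Classical Hopfield associative memory (Hebbian outer-product storage)
# and Oja's learning rule for one linear neuron. A toy analogue of the
# network model in the abstract; it does not model the spintronic device.

def train_hopfield(patterns):
    """Store +/-1 patterns in a symmetric weight matrix with zero diagonal."""
    n = len(patterns[0])
    W = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i][j] += p[i] * p[j] / n
    return W

def recall(W, probe, steps=10):
    """Asynchronous sign updates: relax a noisy probe toward a stored pattern."""
    s = list(probe)
    n = len(s)
    for _ in range(steps):
        for i in range(n):
            h = sum(W[i][j] * s[j] for j in range(n))
            s[i] = 1 if h >= 0 else -1
    return s

def oja_step(w, x, eta=0.1):
    """One step of Oja's rule, dw = eta * y * (x - y * w), the
    gradient-descent-like update the conductance matrix is said to adapt."""
    y = sum(wi * xi for wi, xi in zip(w, x))
    return [wi + eta * y * (xi - y * wi) for wi, xi in zip(w, x)]

# Associative recall: flip one bit of a stored pattern, then relax back.
stored = [1, 1, 1, -1, -1, -1]
W = train_hopfield([stored])
noisy = list(stored)
noisy[0] = -noisy[0]
print(recall(W, noisy) == stored)  # True: the corrupted bit is repaired
```

Repeated `oja_step` calls normalize the weight vector while aligning it with the dominant input direction, which is the "unsupervised learning via gradient descent" behavior the abstract attributes to the physical evolution of the conductance matrix.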


Figures (PMC):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3fd8/11665918/44027876955a/pnas.2416294121fig01.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3fd8/11665918/7e854e07a53f/pnas.2416294121fig02.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3fd8/11665918/27aef5e2614f/pnas.2416294121fig03.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3fd8/11665918/b2624d702546/pnas.2416294121fig04.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3fd8/11665918/88b3e8dc149a/pnas.2416294121fig05.jpg

Similar Articles

1. A self-learning magnetic Hopfield neural network with intrinsic gradient descent adaption.
Proc Natl Acad Sci U S A. 2024 Dec 17;121(51):e2416294121. doi: 10.1073/pnas.2416294121. Epub 2024 Dec 13.
2. Memristors for Neuromorphic Circuits and Artificial Intelligence Applications.
Materials (Basel). 2020 Feb 20;13(4):938. doi: 10.3390/ma13040938.
3. Hopfield Neural Network Flow: A Geometric Viewpoint.
IEEE Trans Neural Netw Learn Syst. 2020 Nov;31(11):4869-4880. doi: 10.1109/TNNLS.2019.2958556. Epub 2020 Oct 29.
4. Robust Exponential Memory in Hopfield Networks.
J Math Neurosci. 2018 Jan 16;8(1):1. doi: 10.1186/s13408-017-0056-2.
5. Accelerating the training of feedforward neural networks using generalized Hebbian rules for initializing the internal representations.
IEEE Trans Neural Netw. 1996;7(2):419-26. doi: 10.1109/72.485677.
6. SpikePropamine: Differentiable Plasticity in Spiking Neural Networks.
Front Neurorobot. 2021 Sep 22;15:629210. doi: 10.3389/fnbot.2021.629210. eCollection 2021.
7. Non-linear Memristive Synaptic Dynamics for Efficient Unsupervised Learning in Spiking Neural Networks.
Front Neurosci. 2021 Feb 1;15:580909. doi: 10.3389/fnins.2021.580909. eCollection 2021.
8. Oscillatory neural network learning for pattern recognition: an on-chip learning perspective and implementation.
Front Neurosci. 2023 Jun 15;17:1196796. doi: 10.3389/fnins.2023.1196796. eCollection 2023.
9. Common nature of learning between back-propagation and Hopfield-type neural networks for generalized matrix inversion with simplified models.
IEEE Trans Neural Netw Learn Syst. 2013 Apr;24(4):579-92. doi: 10.1109/TNNLS.2013.2238555.
10. Learning algorithms for oscillatory neural networks as associative memory for pattern recognition.
Front Neurosci. 2023 Nov 29;17:1257611. doi: 10.3389/fnins.2023.1257611. eCollection 2023.

References Cited in This Article

1. Fully forward mode training for optical neural networks.
Nature. 2024 Aug;632(8024):280-286. doi: 10.1038/s41586-024-07687-4. Epub 2024 Aug 7.
2. Machine learning without a processor: Emergent learning in a nonlinear analog network.
Proc Natl Acad Sci U S A. 2024 Jul 9;121(28):e2319718121. doi: 10.1073/pnas.2319718121. Epub 2024 Jul 2.
3. Backpropagation-free training of deep physical neural networks.
Science. 2023 Dec 15;382(6676):1297-1303. doi: 10.1126/science.adi8474. Epub 2023 Nov 23.
4. Multilayer spintronic neural networks with radiofrequency connections.
Nat Nanotechnol. 2023 Nov;18(11):1273-1280. doi: 10.1038/s41565-023-01452-w. Epub 2023 Jul 27.
5. Experimental demonstration of a skyrmion-enhanced strain-mediated physical reservoir computing system.
Nat Commun. 2023 Jun 10;14(1):3434. doi: 10.1038/s41467-023-39207-9.
6. Distinguishing artificial spin ice states using magnetoresistance effect for neuromorphic computing.
Nat Commun. 2023 May 4;14(1):2562. doi: 10.1038/s41467-023-38286-y.
7. Neuromorphic learning, working memory, and metaplasticity in nanowire networks.
Sci Adv. 2023 Apr 21;9(16):eadg3289. doi: 10.1126/sciadv.adg3289.
8. Photonic online learning: a perspective.
Nanophotonics. 2023 Jan 9;12(5):833-845. doi: 10.1515/nanoph-2022-0553. eCollection 2023 Mar.
9. Memory Formation in Adaptive Networks.
Phys Rev Lett. 2022 Jul 8;129(2):028101. doi: 10.1103/PhysRevLett.129.028101.
10. Learning by mistakes in memristor networks.
Phys Rev E. 2022 May;105(5-1):054306. doi: 10.1103/PhysRevE.105.054306.