Efficient and self-adaptive in-situ learning in multilayer memristor neural networks.

Affiliations

Department of Electrical and Computer Engineering, University of Massachusetts, Amherst, MA, 01003, USA.

Swarthmore College, Swarthmore, PA, 19081, USA.

Publication information

Nat Commun. 2018 Jun 19;9(1):2385. doi: 10.1038/s41467-018-04484-2.

Abstract

Memristors with tunable resistance states are emerging building blocks of artificial neural networks. However, in situ learning on a large-scale multiple-layer memristor network has yet to be demonstrated because of challenges in device property engineering and circuit integration. Here we monolithically integrate hafnium oxide-based memristors with a foundry-made transistor array into a multiple-layer neural network. We experimentally demonstrate in situ learning capability and achieve competitive classification accuracy on a standard machine learning dataset, which further confirms that the training algorithm allows the network to adapt to hardware imperfections. Our simulation using the experimental parameters suggests that a larger network would further increase the classification accuracy. The memristor neural network is a promising hardware platform for artificial intelligence with high speed-energy efficiency.
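The paper does not include code; the following toy sketch only illustrates the general idea the abstract describes: training a small multilayer network "in situ", where every weight update passes through an imperfect analog write, so the learning algorithm itself compensates for hardware non-idealities. The device model here (`imperfect_write` with assumed `LEVELS` quantization and `WRITE_NOISE`) is a hypothetical stand-in, not the authors' hafnium-oxide device characteristics or circuit.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical imperfection model: programmed conductance is quantized to a
# finite number of states and perturbed by relative write noise.
LEVELS = 64          # assumed number of distinguishable conductance levels
WRITE_NOISE = 0.02   # assumed relative programming noise

def imperfect_write(target):
    q = np.round(target * LEVELS) / LEVELS                      # finite resolution
    return q + WRITE_NOISE * np.abs(q) * rng.standard_normal(q.shape)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy two-layer network on XOR; weights "live on the crossbar", so every
# update is written back through the imperfect device model.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0], [1], [1], [0]], float)

W1 = imperfect_write(0.5 * rng.standard_normal((2, 8)))
W2 = imperfect_write(0.5 * rng.standard_normal((8, 1)))
lr = 1.0

for step in range(4000):
    h = sigmoid(X @ W1)                     # layer-1 forward pass
    out = sigmoid(h @ W2)                   # layer-2 forward pass
    d_out = (out - y) * out * (1 - out)     # output-layer error signal
    dW2 = h.T @ d_out                       # backprop gradients
    dW1 = X.T @ ((d_out @ W2.T) * h * (1 - h))
    # In-situ update: the new weights are programmed through the noisy,
    # quantized write, and subsequent gradients are computed from what was
    # actually stored -- this is how training absorbs the imperfections.
    W2 = imperfect_write(W2 - lr * dW2)
    W1 = imperfect_write(W1 - lr * dW1)

pred = sigmoid(sigmoid(X @ W1) @ W2)
print(pred.ravel())
```

Because gradients are always taken with respect to the weights as actually stored (noise and quantization included), the loop converges toward the XOR targets despite the imperfect writes, mirroring the self-adaptive behavior the abstract reports.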

Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/aacc/6008303/b5abc1ea004e/41467_2018_4484_Fig1_HTML.jpg
