

Nonvolatile Memory Materials for Neuromorphic Intelligent Machines.

Affiliations

Center for Electronic Materials, Korea Institute of Science and Technology, 5 Hwarang-ro 14-gil, Seongbuk-gu, Seoul, 02792, Republic of Korea.

Division of Materials Science and Engineering, Hanyang University, 222, Wangsimni-ro, Seongdong-gu, Seoul, 04763, South Korea.

Publication information

Adv Mater. 2018 Oct;30(42):e1704729. doi: 10.1002/adma.201704729. Epub 2018 Apr 18.

Abstract

Recent progress in deep learning has extended the capability of artificial intelligence to a wide range of practical tasks, making the deep neural network (DNN) an extremely versatile hypothesis. While such DNNs are, in practice, run on contemporary data centers built on the von Neumann architecture, a (partly) physical DNN of non-von Neumann architecture, also known as neuromorphic computing, can remarkably improve learning and inference efficiency. In particular, resistance-based nonvolatile random access memory (NVRAM) lends itself to a compact and efficient implementation of the multiply-accumulate (MAC) operation in the analog domain. Here, an overview is given of the available types of resistance-based NVRAM and their technological maturity from the material and device points of view. Examples within this strategy are then discussed in comparison with their benchmarks (virtual DNNs in deep learning). The spiking neural network (SNN) is another type of neural network, one that is more biologically plausible than the DNN. Successfully incorporating resistance-based NVRAM into SNN-based neuromorphic computing offers an efficient solution to both the MAC operation and spike-timing-based learning. This strategy is exemplified from a material perspective. Intelligent machines are categorized according to their architecture and learning type, and the functionality and usefulness of NVRAM-based neuromorphic computing are addressed.
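The analog MAC operation mentioned above can be illustrated with a minimal numerical sketch. It assumes an idealized resistive crossbar (linear cell I-V, no wire resistance or sneak paths, which real devices do not satisfy): each NVRAM cell stores a conductance G[i, j], and applying read voltages V[i] to the rows produces, by Ohm's law and Kirchhoff's current law, column currents I[j] = Σ_i V[i]·G[i, j], i.e. an entire vector-matrix multiply in a single analog step. All names here are illustrative, not from the review.

```python
import numpy as np

def crossbar_mac(voltages, conductances):
    """Ideal crossbar read: column currents I = V @ G.

    voltages:     shape (rows,), row read voltages in volts.
    conductances: shape (rows, cols), cell conductances in siemens.
    Returns column currents, shape (cols,).
    """
    return voltages @ conductances

rng = np.random.default_rng(0)
# Assumed (hypothetical) conductance window of 1 uS to 100 uS.
G = rng.uniform(1e-6, 1e-4, size=(4, 3))
V = np.array([0.1, 0.2, 0.0, 0.3])  # read voltages in volts

I = crossbar_mac(V, G)

# The one-step analog readout equals the explicit multiply-accumulate loop:
I_loop = np.array([sum(V[i] * G[i, j] for i in range(4)) for j in range(3)])
assert np.allclose(I, I_loop)
```

The point of the sketch is the complexity argument: a digital MAC needs rows × cols multiplications and additions per output vector, whereas the ideal crossbar delivers all column currents simultaneously, which is why the review emphasizes NVRAM for the MAC bottleneck.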

