
Scaling-efficient in-situ training of CMOL CrossNet classifiers.

Affiliation

Department of Physics and Astronomy, Stony Brook University, Stony Brook, NY, USA.

Publication information

Neural Netw. 2011 Dec;24(10):1136-42. doi: 10.1016/j.neunet.2011.06.015. Epub 2011 Jul 1.

Abstract

CMOL CrossNets, hybrid CMOS/nanoelectronic neuromorphic circuits, may open up exciting opportunities for building brain-like artificial intelligence. However, the limited functionality of the nanodevices used in CMOL circuits poses significant challenges to training CrossNets with the usual algorithms. To overcome these challenges, we developed an in-situ variant of the error backpropagation method for supervised training of CrossNet-based pattern classifiers. Although this algorithm successfully trained CrossNets on simple benchmark classification tasks from Proben1, we found that it did not scale up to larger problems such as the MNIST dataset. We therefore propose an alternative in-situ method that combines training with hidden-layer build-up. Simulation results suggest that the new in-situ approach is suitable for training CrossNets to perform classification on practical problems.
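The paper's in-situ scheme operates on analog crossbar hardware, but the general idea of combining training with hidden-layer build-up can be illustrated in ordinary software. The sketch below is a hypothetical NumPy illustration, not the authors' algorithm: XOR stands in for a small benchmark task, and the growth schedule (2 hidden units, then 6) is an arbitrary assumption. It trains a one-hidden-layer sigmoid network by batch backpropagation, then enlarges the hidden layer while preserving the learned weights and continues training.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR stands in for a small benchmark classification task.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])
Xb = np.hstack([X, np.ones((4, 1))])   # inputs with an appended bias input
ones = np.ones((4, 1))

def train(W1, W2, epochs=5000, lr=0.5):
    """Plain batch backpropagation on a one-hidden-layer sigmoid net."""
    for _ in range(epochs):
        H = sigmoid(Xb @ W1)                   # hidden activations
        Hb = np.hstack([H, ones])              # hidden layer + bias unit
        O = sigmoid(Hb @ W2)                   # network output
        dO = (O - y) * O * (1.0 - O)           # output-layer delta
        dH = (dO @ W2[:-1].T) * H * (1.0 - H)  # hidden-layer delta (skip bias row)
        W2 -= lr * Hb.T @ dO
        W1 -= lr * Xb.T @ dH
    return W1, W2, float(np.mean((O - y) ** 2))

def grow(W1, W2, new_h):
    """Add hidden units, keeping previously learned weights intact."""
    h = W1.shape[1]
    W1 = np.hstack([W1, 0.1 * rng.standard_normal((3, new_h - h))])
    # New hidden units get fresh output weights; the bias row stays last.
    W2 = np.vstack([W2[:-1], 0.1 * rng.standard_normal((new_h - h, 1)), W2[-1:]])
    return W1, W2

# Phase 1: train with a deliberately small hidden layer.
W1 = 0.5 * rng.standard_normal((3, 2))
W2 = 0.5 * rng.standard_normal((3, 1))
W1, W2, loss_small = train(W1, W2)

# Phase 2: build the hidden layer up to 6 units and keep training.
W1, W2 = grow(W1, W2, 6)
W1, W2, loss_big = train(W1, W2)
print(f"loss after build-up: {loss_big:.4f}")
```

The design point this sketch tries to capture is that growth does not discard earlier learning: the padded weight matrices keep the trained sub-network as a warm start, so only the freshly added units need to be fitted in the second phase.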

