SWsnn: A Novel Simulator for Spiking Neural Networks.

Affiliations

Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China.

Southern University of Science and Technology, Shenzhen, China.

Publication Information

J Comput Biol. 2023 Sep;30(9):951-960. doi: 10.1089/cmb.2023.0098. Epub 2023 Aug 16.

Abstract

Spiking neural network (SNN) simulators play an important role in neural system modeling and brain function research. They help scientists reproduce and explore neuronal activity in brain regions, supporting work in neuroscience and brain-like computing, and they are also applied in artificial intelligence, machine learning, and related fields. Many simulators based on central processing units (CPUs) or graphics processing units (GPUs) have been developed. However, the randomness of inter-neuron connections and of spiking events in SNN simulation incurs a large amount of memory access time. To alleviate this problem, we developed SWsnn, an SNN simulator based on the new Sunway SW26010pro processor. The SW26010pro processor consists of six core groups, each with 16 MB of local data memory (LDM). The LDM provides high-speed reads and writes, which makes it well suited to simulation tasks such as SNNs. Experimental results show that SWsnn runs faster than other mainstream GPU-based simulators when simulating neural networks of a certain scale, demonstrating a clear performance advantage. To conduct larger scale simulations, we designed a simulation scheme based on the Sunway processor's large shared mode and developed a multiprocessor version of SWsnn on top of it, enabling larger scale SNN simulations.
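
The abstract's key technical claim is that random connectivity and random spiking make memory access, rather than arithmetic, the bottleneck, and that the 16 MB per-core-group LDM alleviates it. The sketch below is a minimal, generic leaky integrate-and-fire (LIF) simulation step, not SWsnn's actual code: the data structures, parameter values, and names (Neuron, step, Adjacency) are illustrative assumptions. It shows how the neuron-state sweep is contiguous while spike delivery scatters reads and writes over randomly chosen post-synaptic indices, which is the access pattern a fast scratchpad such as the LDM is intended to absorb.

```cpp
// Minimal LIF network step used only to illustrate the memory-access pattern
// discussed in the abstract; this is NOT SWsnn's implementation.
#include <cstddef>
#include <random>
#include <vector>

struct Neuron {
    float v = 0.0f;      // membrane potential
    float i_syn = 0.0f;  // synaptic input accumulated for the current step
};

// targets[pre] lists the post-synaptic neuron indices of neuron `pre`.
using Adjacency = std::vector<std::vector<std::size_t>>;

void step(std::vector<Neuron>& neurons, const Adjacency& targets,
          float dt, float tau, float v_thresh, float weight) {
    std::vector<std::size_t> spiked;

    // 1) Neuron update: a contiguous sweep, friendly to caches or an LDM tile.
    for (std::size_t n = 0; n < neurons.size(); ++n) {
        Neuron& nr = neurons[n];
        nr.v += dt * (-nr.v / tau + nr.i_syn);
        nr.i_syn = 0.0f;
        if (nr.v >= v_thresh) {
            nr.v = 0.0f;
            spiked.push_back(n);
        }
    }

    // 2) Spike delivery: post-synaptic indices are random, so these updates
    //    scatter across memory; this is the costly part a fast scratchpad
    //    (such as the SW26010pro's LDM) is meant to keep close to the cores.
    for (std::size_t pre : spiked)
        for (std::size_t post : targets[pre])
            neurons[post].i_syn += weight;
}

int main() {
    const std::size_t n_neurons = 10000, fan_out = 100;
    std::vector<Neuron> neurons(n_neurons);
    Adjacency targets(n_neurons);

    // Random connectivity, the source of the scattered accesses.
    std::mt19937 rng(42);
    std::uniform_int_distribution<std::size_t> pick(0, n_neurons - 1);
    for (auto& row : targets)
        for (std::size_t k = 0; k < fan_out; ++k)
            row.push_back(pick(rng));

    // Drive the network with a constant current and run a few steps.
    for (int t = 0; t < 100; ++t) {
        for (auto& nr : neurons) nr.i_syn += 1.5f;
        step(neurons, targets, /*dt=*/1.0f, /*tau=*/20.0f,
             /*v_thresh=*/30.0f, /*weight=*/0.5f);
    }
    return 0;
}
```

Keeping the hot per-neuron state this compact is what would make it plausible to stage a tile of neurons inside a 16 MB scratchpad; how SWsnn actually partitions state across the six core groups is described in the paper itself, not here.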
