
Topological features of spike trains in recurrent spiking neural networks that are trained to generate spatiotemporal patterns.

Authors

Maslennikov Oleg, Perc Matjaž, Nekorkin Vladimir

Affiliations

Federal Research Center A.V. Gaponov-Grekhov Institute of Applied Physics of the Russian Academy of Sciences, Nizhny Novgorod, Russia.

Faculty of Natural Sciences and Mathematics, University of Maribor, Maribor, Slovenia.

Publication

Front Comput Neurosci. 2024 Feb 23;18:1363514. doi: 10.3389/fncom.2024.1363514. eCollection 2024.

DOI: 10.3389/fncom.2024.1363514
PMID: 38463243
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10920356/
Abstract

In this study, we focus on training recurrent spiking neural networks to generate spatiotemporal patterns in the form of closed two-dimensional trajectories. Spike trains in the trained networks are examined in terms of their dissimilarity using the Victor-Purpura distance. We apply algebraic topology methods to the matrices obtained by rank-ordering the entries of the distance matrices, specifically calculating the persistence barcodes and Betti curves. By comparing the features of different types of output patterns, we uncover the complex relations between low-dimensional target signals and the underlying multidimensional spike trains.
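The Victor-Purpura distance mentioned in the abstract measures the dissimilarity of two spike trains as the minimum cost of transforming one into the other, with unit cost for inserting or deleting a spike and cost q·|Δt| for shifting a spike in time. The paper does not give its implementation or choice of q; the following is an illustrative dynamic-programming sketch of the standard metric:

```python
def victor_purpura(t1, t2, q):
    """Victor-Purpura spike-train distance via dynamic programming.

    t1, t2: sorted lists of spike times; q: cost per unit time shift.
    Insert/delete costs 1; shifting a spike by dt costs q * |dt|.
    """
    n, m = len(t1), len(t2)
    # D[i][j]: distance between the first i spikes of t1 and first j of t2
    D = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        D[i][0] = float(i)          # delete all i spikes of t1
    for j in range(1, m + 1):
        D[0][j] = float(j)          # insert all j spikes of t2
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            D[i][j] = min(
                D[i - 1][j] + 1,    # delete spike i of t1
                D[i][j - 1] + 1,    # insert spike j of t2
                D[i - 1][j - 1] + q * abs(t1[i - 1] - t2[j - 1]),  # shift
            )
    return D[n][m]
```

For small q the metric approaches a spike-count difference; for large q, shifting becomes more expensive than deleting and re-inserting, so the distance counts non-coincident spikes.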

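The abstract also describes computing Betti curves from the (rank-ordered) distance matrices. The zeroth Betti curve counts connected components of the Vietoris-Rips graph as the filtration threshold grows, which can be sketched with a union-find structure; this is a minimal illustration assuming a plain symmetric distance matrix, not the paper's full persistent-homology pipeline (higher Betti numbers require a library such as GUDHI or Ripser):

```python
def betti0_curve(dist, thresholds):
    """Betti-0 curve: number of connected components of the graph whose
    edges are pairs (i, j) with dist[i][j] <= eps, for each eps."""
    n = len(dist)

    def find(parent, x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    curve = []
    for eps in thresholds:
        parent = list(range(n))
        for i in range(n):
            for j in range(i + 1, n):
                if dist[i][j] <= eps:
                    ri, rj = find(parent, i), find(parent, j)
                    if ri != rj:
                        parent[ri] = rj    # merge components
        curve.append(len({find(parent, i) for i in range(n)}))
    return curve
```

Replacing each distance by its rank within the matrix, as the paper does, makes the curve depend only on the ordering of dissimilarities, not their absolute scale.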

Figures (PMC10920356):
Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/82b8/10920356/679230fe7a9f/fncom-18-1363514-g0001.jpg
Figure 2: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/82b8/10920356/badfa387a5b8/fncom-18-1363514-g0002.jpg
Figure 3: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/82b8/10920356/e5573ad38596/fncom-18-1363514-g0003.jpg
Figure 4: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/82b8/10920356/8db86af2dc49/fncom-18-1363514-g0004.jpg
Figure 5: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/82b8/10920356/d740cad75326/fncom-18-1363514-g0005.jpg
Figure 6: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/82b8/10920356/92d7c444ccf0/fncom-18-1363514-g0006.jpg
Figure 7: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/82b8/10920356/9e79d381d24a/fncom-18-1363514-g0007.jpg

Similar articles

1. Topological features of spike trains in recurrent spiking neural networks that are trained to generate spatiotemporal patterns.
Front Comput Neurosci. 2024 Feb 23;18:1363514. doi: 10.3389/fncom.2024.1363514. eCollection 2024.
2. Dynamic Spatiotemporal Pattern Recognition With Recurrent Spiking Neural Network.
Neural Comput. 2021 Oct 12;33(11):2971-2995. doi: 10.1162/neco_a_01432.
3. A Highly Effective and Robust Membrane Potential-Driven Supervised Learning Method for Spiking Neurons.
IEEE Trans Neural Netw Learn Syst. 2019 Jan;30(1):123-137. doi: 10.1109/TNNLS.2018.2833077. Epub 2018 May 28.
4. Supervised Learning Algorithm for Multilayer Spiking Neural Networks with Long-Term Memory Spike Response Model.
Comput Intell Neurosci. 2021 Nov 24;2021:8592824. doi: 10.1155/2021/8592824. eCollection 2021.
5. Supervised learning in spiking neural networks: A review of algorithms and evaluations.
Neural Netw. 2020 May;125:258-280. doi: 10.1016/j.neunet.2020.02.011. Epub 2020 Feb 25.
6. A Spike Train Distance Robust to Firing Rate Changes Based on the Earth Mover's Distance.
Front Comput Neurosci. 2019 Dec 10;13:82. doi: 10.3389/fncom.2019.00082. eCollection 2019.
7. An Attention-Based Spiking Neural Network for Unsupervised Spike-Sorting.
Int J Neural Syst. 2019 Oct;29(8):1850059. doi: 10.1142/S0129065718500594. Epub 2018 Dec 27.
8. Pre-Synaptic Pool Modification (PSPM): A supervised learning procedure for recurrent spiking neural networks.
PLoS One. 2020 Feb 24;15(2):e0229083. doi: 10.1371/journal.pone.0229083. eCollection 2020.
9. A Supervised Learning Algorithm for Learning Precise Timing of Multiple Spikes in Multilayer Spiking Neural Networks.
IEEE Trans Neural Netw Learn Syst. 2018 Nov;29(11):5394-5407. doi: 10.1109/TNNLS.2018.2797801. Epub 2018 Mar 1.
10. Internal dynamics of recurrent neural networks trained to generate complex spatiotemporal patterns.
Chaos. 2023 Sep 1;33(9). doi: 10.1063/5.0166359.

References cited in this article

1. Opportunities for neuromorphic computing algorithms and applications.
Nat Comput Sci. 2022 Jan;2(1):10-19. doi: 10.1038/s43588-021-00184-y. Epub 2022 Jan 31.
2. BrainCog: A spiking neural network based, brain-inspired cognitive intelligence engine for brain-inspired AI and brain simulation.
Patterns (N Y). 2023 Jul 6;4(8):100789. doi: 10.1016/j.patter.2023.100789. eCollection 2023 Aug 11.
3. Geometry of population activity in spiking networks with low-rank structure.
PLoS Comput Biol. 2023 Aug 7;19(8):e1011315. doi: 10.1371/journal.pcbi.1011315. eCollection 2023 Aug.
4. A Hippocampal-Entorhinal Cortex Neuronal Network for Dynamical Mechanisms of Epileptic Seizure.
IEEE Trans Neural Syst Rehabil Eng. 2023;31:1986-1996. doi: 10.1109/TNSRE.2023.3265581. Epub 2023 Apr 12.
5. Multitask computation through dynamics in recurrent spiking neural networks.
Sci Rep. 2023 Mar 10;13(1):3997. doi: 10.1038/s41598-023-31110-z.
6. The centrality of population-level factors to network computation is demonstrated by a versatile approach for training spiking networks.
Neuron. 2023 Mar 1;111(5):631-649.e10. doi: 10.1016/j.neuron.2022.12.007. Epub 2023 Jan 10.
7. Geometry of spiking patterns in early visual cortex: a topological data analytic approach.
J R Soc Interface. 2022 Nov;19(196):20220677. doi: 10.1098/rsif.2022.0677. Epub 2022 Nov 16.
8. Generative Models of Brain Dynamics.
Front Artif Intell. 2022 Jul 15;5:807406. doi: 10.3389/frai.2022.807406. eCollection 2022.
9. Spiking Neural Networks and Their Applications: A Review.
Brain Sci. 2022 Jun 30;12(7):863. doi: 10.3390/brainsci12070863.
10. The role of population structure in computations through neural dynamics.
Nat Neurosci. 2022 Jun;25(6):783-794. doi: 10.1038/s41593-022-01088-4. Epub 2022 Jun 6.