Burst and Memory-aware Transformer: capturing temporal heterogeneity.

Authors

Lee Byounghwa, Lee Jung-Hoon, Lee Sungyup, Kim Cheol Ho

Affiliations

CybreBrain Research Section, Electronics and Telecommunications Research Institute, Daejeon, Republic of Korea.

Publication

Front Comput Neurosci. 2023 Dec 12;17:1292842. doi: 10.3389/fncom.2023.1292842. eCollection 2023.

DOI: 10.3389/fncom.2023.1292842
PMID: 38148765
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10749928/
Abstract

Burst patterns, characterized by their temporal heterogeneity, have been observed across a wide range of domains, encompassing event sequences from neuronal firing to various facets of human activities. Recent research on predicting event sequences leveraged a Transformer based on the Hawkes process, incorporating a self-attention mechanism to capture long-term temporal dependencies. To effectively handle bursty temporal patterns, we propose a Burst and Memory-aware Transformer (BMT) model, designed to explicitly address temporal heterogeneity. The BMT model embeds the burstiness and memory coefficient into the self-attention module, enhancing the learning process with insights derived from the bursty patterns. Furthermore, we employed a novel loss function designed to optimize the burstiness and memory coefficient values, as well as their corresponding discretized one-hot vectors, both individually and jointly. Numerical experiments conducted on diverse synthetic and real-world datasets demonstrated the outstanding performance of the BMT model in terms of accurately predicting event times and intensity functions compared to existing models and control groups. In particular, the BMT model exhibits remarkable performance for temporally heterogeneous data, such as those with power-law inter-event time distributions. Our findings suggest that the incorporation of burst-related parameters assists the Transformer in comprehending heterogeneous event sequences, leading to an enhanced predictive performance.
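
The burstiness and memory coefficient that BMT embeds into self-attention are standard statistics of an inter-event time sequence in the bursty-dynamics literature: burstiness B = (sigma - mu) / (sigma + mu), where mu and sigma are the mean and standard deviation of inter-event times, and memory M is the Pearson correlation between consecutive inter-event times. The short Python sketch below illustrates these conventional definitions; it is an illustration under those assumptions, not code from the paper, and the helper names burstiness and memory_coefficient are hypothetical.

# Minimal sketch (illustration only, not code from the paper): the commonly used
# burstiness coefficient B = (sigma - mu) / (sigma + mu) and the memory
# coefficient M (correlation between consecutive inter-event times).
import numpy as np

def burstiness(iet: np.ndarray) -> float:
    # B lies in (-1, 1); B > 0 indicates burstier-than-Poisson timing.
    mu, sigma = iet.mean(), iet.std()
    return (sigma - mu) / (sigma + mu)

def memory_coefficient(iet: np.ndarray) -> float:
    # Pearson correlation between consecutive inter-event times.
    return float(np.corrcoef(iet[:-1], iet[1:])[0, 1])

# A memoryless (Poisson-like) stream gives B and M near 0 ...
iet_poisson = np.random.exponential(scale=1.0, size=10_000)
# ... while heavy-tailed, power-law-like inter-event times give B well above 0.
iet_bursty = np.random.pareto(1.5, size=10_000) + 1.0
print(burstiness(iet_poisson), memory_coefficient(iet_poisson))
print(burstiness(iet_bursty), memory_coefficient(iet_bursty))

In BMT, per the abstract, these two statistics (and their discretized one-hot encodings) are the burst-related quantities fed into the self-attention module and optimized by the dedicated loss terms.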

Figures (PMC full text):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/bb8f/10749928/41c6c23ff66f/fncom-17-1292842-g0001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/bb8f/10749928/ebcdeb57068a/fncom-17-1292842-g0002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/bb8f/10749928/8a89dc321266/fncom-17-1292842-g0003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/bb8f/10749928/5184451f34fc/fncom-17-1292842-g0004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/bb8f/10749928/a54778b819b6/fncom-17-1292842-g0005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/bb8f/10749928/6aad4e78afe5/fncom-17-1292842-g0006.jpg

Similar articles

1
Burst and Memory-aware Transformer: capturing temporal heterogeneity.
Front Comput Neurosci. 2023 Dec 12;17:1292842. doi: 10.3389/fncom.2023.1292842. eCollection 2023.
2
A novel hybrid framework based on temporal convolution network and transformer for network traffic prediction.
PLoS One. 2023 Sep 8;18(9):e0288935. doi: 10.1371/journal.pone.0288935. eCollection 2023.
3
Spatial-Temporal Convolutional Transformer Network for Multivariate Time Series Forecasting.
Sensors (Basel). 2022 Jan 22;22(3):841. doi: 10.3390/s22030841.
4
Burst-tree decomposition of time series reveals the structure of temporal correlations.
Sci Rep. 2020 Jul 22;10(1):12202. doi: 10.1038/s41598-020-68157-1.
5
MR-Transformer: Multiresolution Transformer for Multivariate Time Series Prediction.
IEEE Trans Neural Netw Learn Syst. 2025 Jan;36(1):1171-1183. doi: 10.1109/TNNLS.2023.3327416. Epub 2025 Jan 7.
6
Limits of the memory coefficient in measuring correlated bursts.
Phys Rev E. 2018 Mar;97(3-1):032121. doi: 10.1103/PhysRevE.97.032121.
7
Deep transformer-based heterogeneous spatiotemporal graph learning for geographical traffic forecasting.
iScience. 2024 Jun 25;27(7):110175. doi: 10.1016/j.isci.2024.110175. eCollection 2024 Jul 19.
8
Unsupervised Low-Light Video Enhancement With Spatial-Temporal Co-Attention Transformer.
IEEE Trans Image Process. 2023;32:4701-4715. doi: 10.1109/TIP.2023.3301332. Epub 2023 Aug 16.
9
Disentangled Dynamic Deviation Transformer Networks for Multivariate Time Series Anomaly Detection.
Sensors (Basel). 2023 Jan 18;23(3):1104. doi: 10.3390/s23031104.
10
Improved deep learning image classification algorithm based on Swin Transformer V2.
PeerJ Comput Sci. 2023 Oct 30;9:e1665. doi: 10.7717/peerj-cs.1665. eCollection 2023.
