

Multi-scale convolution enhanced transformer for multivariate long-term time series forecasting.

Affiliations

School of Software, Shandong University, Jinan 250101, China.

School of Software, Shandong University, Jinan 250101, China; Shandong Provincial Laboratory of Future Intelligence and Financial Engineering, Yantai 264005, China.

Publication information

Neural Netw. 2024 Dec;180:106745. doi: 10.1016/j.neunet.2024.106745. Epub 2024 Sep 23.

DOI: 10.1016/j.neunet.2024.106745
PMID: 39340967
Abstract

Multivariate long-term time series forecasting remains a challenge in data analysis. Among deep learning methods, the Transformer has shown significant potential for time series forecasting. However, the quadratic computational complexity of the Transformer's dot-product attention mechanism impairs training and forecasting efficiency, and the architecture is limited in modeling local features and in handling cross-dimensional dependencies among variables. This article proposes a Multi-Scale Convolution Enhanced Transformer (MSCformer) for multivariate long-term time series forecasting. Instead of modeling the time series in its entirety, a segmentation strategy converts the input series into segments of different lengths, which are then processed by a newly constructed Multi-Dependency Aggregation module. This multi-scale segmentation reduces the computational complexity of the attention mechanism in the subsequent model; because each segment length corresponds to a specific time scale, it also ensures that each segment retains sequence-level semantic information, so the model comprehensively exploits the multi-scale information in the data while capturing the true dependencies of the time series more accurately. The Multi-Dependency Aggregation module captures both cross-temporal and cross-dimensional dependencies of the multivariate series and compensates for local dependencies within segments, thereby capturing local series features comprehensively and addressing insufficient information utilization. MSCformer synthesizes the dependency information extracted from temporal segments at different scales and reconstructs the future series using linear layers.
MSCformer achieves higher forecasting accuracy, outperforming existing methods in multiple domains including energy, transportation, weather, electricity, disease, and finance.
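The segmentation strategy described above can be sketched in a few lines. The patch lengths, the (time, channels) layout, and the policy of trimming the ragged tail are illustrative assumptions here, not the authors' exact configuration; the point is that attention over T/p patches costs O((T/p)²) instead of O(T²) over raw steps.

```python
import numpy as np

def multi_scale_segments(x, patch_lengths=(4, 8, 16)):
    """Split a (time, channels) series into non-overlapping segments
    at several scales, mimicking the paper's segmentation strategy.
    Patch lengths are hypothetical example values."""
    segments = {}
    t, c = x.shape
    for p in patch_lengths:
        n = t // p               # number of whole patches at this scale
        trimmed = x[:n * p]      # drop the ragged tail for simplicity
        # each scale yields shape (n_patches, patch_len, channels)
        segments[p] = trimmed.reshape(n, p, c)
    return segments

series = np.random.randn(96, 7)  # 96 time steps, 7 variables
segs = multi_scale_segments(series)
for p, s in segs.items():
    print(p, s.shape)  # e.g. 4 (24, 4, 7); 8 (12, 8, 7); 16 (6, 16, 7)
```

Each per-scale tensor would then be fed to the model's dependency-aggregation stage; with 96 input steps and patch length 8, attention operates over 12 segment tokens rather than 96 raw steps.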


Similar articles

1
Multi-scale convolution enhanced transformer for multivariate long-term time series forecasting.
Neural Netw. 2024 Dec;180:106745. doi: 10.1016/j.neunet.2024.106745. Epub 2024 Sep 23.
2
A Joint Time-Frequency Domain Transformer for multivariate time series forecasting.
Neural Netw. 2024 Aug;176:106334. doi: 10.1016/j.neunet.2024.106334. Epub 2024 Apr 25.
3
DyGraphformer: Transformer combining dynamic spatio-temporal graph network for multivariate time series forecasting.
Neural Netw. 2025 Jan;181:106776. doi: 10.1016/j.neunet.2024.106776. Epub 2024 Oct 17.
4
Spatial-Temporal Convolutional Transformer Network for Multivariate Time Series Forecasting.
Sensors (Basel). 2022 Jan 22;22(3):841. doi: 10.3390/s22030841.
5
Spatial linear transformer and temporal convolution network for traffic flow prediction.
Sci Rep. 2024 Feb 19;14(1):4040. doi: 10.1038/s41598-024-54114-9.
6
ETU-Net: edge enhancement-guided U-Net with transformer for skin lesion segmentation.
Phys Med Biol. 2023 Dec 22;69(1). doi: 10.1088/1361-6560/ad13d2.
7
A deep learning-based framework (Co-ReTr) for auto-segmentation of non-small cell-lung cancer in computed tomography images.
J Appl Clin Med Phys. 2024 Mar;25(3):e14297. doi: 10.1002/acm2.14297. Epub 2024 Feb 19.
8
ETUNet: Exploring efficient transformer enhanced UNet for 3D brain tumor segmentation.
Comput Biol Med. 2024 Mar;171:108005. doi: 10.1016/j.compbiomed.2024.108005. Epub 2024 Jan 23.
9
Developing a multivariate time series forecasting framework based on stacked autoencoders and multi-phase feature.
Heliyon. 2024 Mar 19;10(7):e27860. doi: 10.1016/j.heliyon.2024.e27860. eCollection 2024 Apr 15.
10
DiagSWin: A multi-scale vision transformer with diagonal-shaped windows for object detection and segmentation.
Neural Netw. 2024 Dec;180:106653. doi: 10.1016/j.neunet.2024.106653. Epub 2024 Aug 22.