
Multi-scale convolution enhanced transformer for multivariate long-term time series forecasting.

Affiliations

School of Software, Shandong University, Jinan 250101, China.

School of Software, Shandong University, Jinan 250101, China; Shandong Provincial Laboratory of Future Intelligence and Financial Engineering, Yantai 264005, China.

Publication information

Neural Netw. 2024 Dec;180:106745. doi: 10.1016/j.neunet.2024.106745. Epub 2024 Sep 23.

Abstract

Challenges persist in data analysis and forecasting, particularly for multivariate long-term time series. Among deep learning methods, the Transformer model has shown significant potential for time series forecasting. However, the quadratic computational complexity of the Transformer's dot-product attention mechanism impairs training and forecasting efficiency. In addition, the Transformer architecture has limitations in modeling local features and in handling cross-dimensional dependencies among multiple variables. In this article, a Multi-Scale Convolution Enhanced Transformer model (MSCformer) is proposed for multivariate long-term time series forecasting. Instead of modeling the time series in its entirety, a segmentation strategy converts the input series into segments of different lengths, which are then processed by a newly constructed Multi-Dependency Aggregation module. This multi-scale segmentation reduces the computational complexity of the attention mechanism in the subsequent model, and, because each segment length corresponds to a specific time scale, it ensures that each segment retains sequence-level semantic information; the multi-scale information of the data is thereby used comprehensively while the true dependencies of the time series are captured more accurately. The Multi-Dependency Aggregation module captures both cross-temporal and cross-dimensional dependencies of multivariate long-term time series and compensates for local dependencies within the segments, thereby capturing local series features comprehensively and addressing the issue of insufficient information utilization. MSCformer synthesizes the dependency information extracted from temporal segments at different scales and reconstructs the future series using linear layers. MSCformer achieves higher forecasting accuracy, outperforming existing methods on datasets from multiple domains including energy, transportation, weather, electricity, disease, and finance.
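
The multi-scale segmentation idea described above can be illustrated with a short PyTorch sketch. The code below is a minimal, hypothetical illustration only: it splits each channel into non-overlapping segments at several scales, applies self-attention over the segments of each scale, and reconstructs the forecast with a linear layer. The segment lengths, embedding sizes, and module names are assumptions, and the sketch omits the paper's convolution enhancement, cross-dimensional attention, and the actual Multi-Dependency Aggregation design; it is not the authors' MSCformer implementation.

import torch
import torch.nn as nn

class MultiScaleSegmentEncoder(nn.Module):
    """Split each channel into non-overlapping segments at several scales,
    embed the segments, attend over the segments of each scale, and map the
    pooled per-scale summaries to the forecast horizon with a linear layer."""

    def __init__(self, pred_len, seg_lens=(12, 24, 48), d_model=64):
        super().__init__()
        self.seg_lens = seg_lens
        self.pred_len = pred_len
        # One segment-embedding layer and one attention encoder per scale.
        self.embed = nn.ModuleList([nn.Linear(s, d_model) for s in seg_lens])
        self.encoders = nn.ModuleList([
            nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
            for _ in seg_lens
        ])
        # Linear head reconstructs the future series from the concatenated
        # per-scale summaries.
        self.head = nn.Linear(d_model * len(seg_lens), pred_len)

    def forward(self, x):
        # x: (batch, input_len, n_vars); each variable is processed
        # independently here, an assumption made for this sketch.
        b, length, n_vars = x.shape
        x = x.permute(0, 2, 1).reshape(b * n_vars, length)
        scale_feats = []
        for seg_len, emb, enc in zip(self.seg_lens, self.embed, self.encoders):
            n_seg = length // seg_len
            segs = x[:, :n_seg * seg_len].reshape(-1, n_seg, seg_len)
            tokens = enc(emb(segs))                 # attention across segments
            scale_feats.append(tokens.mean(dim=1))  # pool segments per scale
        out = self.head(torch.cat(scale_feats, dim=-1))
        return out.reshape(b, n_vars, self.pred_len).permute(0, 2, 1)

# Example: forecast 192 future steps from a 96-step history of 7 variables.
model = MultiScaleSegmentEncoder(pred_len=192)
y = model(torch.randn(8, 96, 7))
print(y.shape)  # torch.Size([8, 192, 7])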

