Integrated codec decomposed Transformer for long-term series forecasting.

Authors

Li Benhan, Zhang Wei, Lu Mingxin

Affiliations

School of Information Management, Nanjing University, Nanjing, 210023, China.

School of Computer Science, Nanjing University of Posts and Telecommunications, Nanjing, 210023, China.

Publication

Neural Netw. 2025 Aug;188:107484. doi: 10.1016/j.neunet.2025.107484. Epub 2025 Apr 23.

DOI: 10.1016/j.neunet.2025.107484
PMID: 40305989
Abstract

Recently, Transformer-based and multilayer perceptron (MLP) based architectures have formed a competitive landscape in the field of time series forecasting. There is evidence that series decomposition can further enhance the model's ability to perceive temporal patterns. However, most of the existing Transformer-based decomposed models capture seasonal features progressively and assist in adding trends for forecasting, but ignore the deep information contained in trends and may lead to pattern mismatch in the fusion stage. In addition, the permutation invariance of the attention mechanism inevitably leads to the loss of temporal order. After in-depth analysis of the applicability of attention and linear layers to series components, we propose to use attention to learn multivariate correlations from trends, and MLP to capture seasonal patterns. We further introduce an integrated codec that provides the same multivariate relationship representation for both the encoding and decoding stages, ensuring effective inheritance of temporal dependencies. To mitigate the fading of sequentiality during attention, we propose trend enhancement module, which maintains the stability of the trend by expanding the series to a longer time scale, helping the attention mechanism to achieve fine-grained feature representations. Extensive experiments show that our model exhibits state-of-the-art prediction performance on large-scale datasets.
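The abstract describes routing the trend component to attention and the seasonal component to an MLP, but does not specify how the series is decomposed. A common baseline in Transformer-based forecasters of this family (e.g. Autoformer-style models) is a moving-average split, where the smoothed series is the trend and the residual is the seasonal part. A minimal sketch under that assumption (`decompose` and `kernel` are illustrative names, not from the paper):

```python
import numpy as np

def decompose(series: np.ndarray, kernel: int = 25):
    """Split a 1-D series into trend and seasonal parts.

    Trend = centered moving average (edges padded by repeating the
    boundary values); seasonal = residual, so trend + seasonal == series.
    """
    pad = kernel // 2
    padded = np.concatenate(
        [np.full(pad, series[0]), series, np.full(pad, series[-1])]
    )
    trend = np.convolve(padded, np.ones(kernel) / kernel, mode="valid")
    seasonal = series - trend
    return trend, seasonal

# Toy series: linear trend plus a daily-like cycle.
t = np.arange(200)
x = 0.05 * t + np.sin(2 * np.pi * t / 24)
trend, seasonal = decompose(x)
```

In the architecture the abstract outlines, `trend` would feed the attention branch (multivariate correlations) and `seasonal` the MLP branch; the decomposition above is only a sketch of the standard preprocessing step, not the paper's exact method.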


Similar Articles

1. Integrated codec decomposed Transformer for long-term series forecasting.
   Neural Netw. 2025 Aug;188:107484. doi: 10.1016/j.neunet.2025.107484. Epub 2025 Apr 23.
2. RFNet: Multivariate long sequence time-series forecasting based on recurrent representation and feature enhancement.
   Neural Netw. 2025 Jan;181:106800. doi: 10.1016/j.neunet.2024.106800. Epub 2024 Oct 23.
3. DyGraphformer: Transformer combining dynamic spatio-temporal graph network for multivariate time series forecasting.
   Neural Netw. 2025 Jan;181:106776. doi: 10.1016/j.neunet.2024.106776. Epub 2024 Oct 17.
4. Multi-scale convolution enhanced transformer for multivariate long-term time series forecasting.
   Neural Netw. 2024 Dec;180:106745. doi: 10.1016/j.neunet.2024.106745. Epub 2024 Sep 23.
5. Spatial-Temporal Convolutional Transformer Network for Multivariate Time Series Forecasting.
   Sensors (Basel). 2022 Jan 22;22(3):841. doi: 10.3390/s22030841.
6. TCDformer: A transformer framework for non-stationary time series forecasting based on trend and change-point detection.
   Neural Netw. 2024 May;173:106196. doi: 10.1016/j.neunet.2024.106196. Epub 2024 Feb 23.
7. A Joint Time-Frequency Domain Transformer for multivariate time series forecasting.
   Neural Netw. 2024 Aug;176:106334. doi: 10.1016/j.neunet.2024.106334. Epub 2024 Apr 25.
8. GCN-Transformer: Graph Convolutional Network and Transformer for Multi-Person Pose Forecasting Using Sensor-Based Motion Data.
   Sensors (Basel). 2025 May 15;25(10):3136. doi: 10.3390/s25103136.
9. Long-term prediction for temporal propagation of seasonal influenza using Transformer-based model.
   J Biomed Inform. 2021 Oct;122:103894. doi: 10.1016/j.jbi.2021.103894. Epub 2021 Aug 26.
10. MDWConv: CNN based on multi-scale atrous pyramid and depthwise separable convolution for long time series forecasting.
   Neural Netw. 2025 May;185:107139. doi: 10.1016/j.neunet.2025.107139. Epub 2025 Jan 16.