
RFNet: Multivariate long sequence time-series forecasting based on recurrent representation and feature enhancement.

Author Information

Zhang Dandan, Zhang Zhiqiang, Chen Nanguang, Wang Yun

Affiliations

School of Computer Science and Engineering, Southeast University, Nanjing, China.

College of Design and Engineering, National University of Singapore, Singapore.

Publication Information

Neural Netw. 2025 Jan;181:106800. doi: 10.1016/j.neunet.2024.106800. Epub 2024 Oct 23.

DOI: 10.1016/j.neunet.2024.106800
PMID: 39488111
Abstract

Multivariate time series exhibit complex patterns and structures involving interactions among multiple variables and long-term temporal dependencies, making multivariate long sequence time series forecasting (MLSTF) exceptionally challenging. Despite significant progress in Transformer-based methods in the MLSTF domain, many models still rely on stacked encoder-decoder architectures to capture complex time series patterns. This leads to increased computational complexity and overlooks spatial pattern information in multivariate time series, thereby limiting the model's performance. To address these challenges, we propose RFNet, a lightweight model based on recurrent representation and feature enhancement. We partition the time series into fixed-size subsequences to retain local contextual temporal pattern information and cross-variable spatial pattern information. The recurrent representation module employs gate attention mechanisms and memory units to capture local information of the subsequences and obtain long-term correlation information of the input sequence by integrating information from different memory units. Meanwhile, we utilize a shared multi-layer perceptron (MLP) to capture global pattern information of the input sequence. The feature enhancement module explicitly extracts complex spatial patterns in the time series by transforming the input sequence. We validate the performance of RFNet on ten real-world datasets. The results demonstrate an improvement of approximately 55.3% over state-of-the-art MLSTF models, highlighting its significant advantage in addressing multivariate long sequence time series forecasting problems.
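The partitioning step the abstract describes, splitting a multivariate series into fixed-size subsequences that retain local temporal context and cross-variable structure, can be sketched as below. This is a minimal illustration only: the `patch_len` and `stride` values, and the function name, are assumptions for demonstration, not parameters taken from the paper.

```python
import numpy as np

def partition_into_subsequences(series: np.ndarray, patch_len: int, stride: int) -> np.ndarray:
    """Split a multivariate series of shape (T, C) into subsequences
    of shape (num_patches, patch_len, C).

    Each patch keeps patch_len consecutive time steps across all C
    variables, so local temporal patterns and cross-variable (spatial)
    structure within the window are preserved together.
    """
    T, C = series.shape
    num_patches = (T - patch_len) // stride + 1
    patches = np.stack([
        series[i * stride : i * stride + patch_len]  # window of shape (patch_len, C)
        for i in range(num_patches)
    ])
    return patches

# Example: 96 time steps, 7 variables, non-overlapping windows of length 16
x = np.random.randn(96, 7)
p = partition_into_subsequences(x, patch_len=16, stride=16)
print(p.shape)  # (6, 16, 7)
```

In a model like the one described, each such subsequence would then be fed to the recurrent-representation and feature-enhancement modules; a smaller stride than `patch_len` would instead yield overlapping windows.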


Similar Articles

1. RFNet: Multivariate long sequence time-series forecasting based on recurrent representation and feature enhancement.
   Neural Netw. 2025 Jan;181:106800. doi: 10.1016/j.neunet.2024.106800. Epub 2024 Oct 23.
2. DyGraphformer: Transformer combining dynamic spatio-temporal graph network for multivariate time series forecasting.
   Neural Netw. 2025 Jan;181:106776. doi: 10.1016/j.neunet.2024.106776. Epub 2024 Oct 17.
3. Multi-scale convolution enhanced transformer for multivariate long-term time series forecasting.
   Neural Netw. 2024 Dec;180:106745. doi: 10.1016/j.neunet.2024.106745. Epub 2024 Sep 23.
4. A Joint Time-Frequency Domain Transformer for multivariate time series forecasting.
   Neural Netw. 2024 Aug;176:106334. doi: 10.1016/j.neunet.2024.106334. Epub 2024 Apr 25.
5. Integrated codec decomposed Transformer for long-term series forecasting.
   Neural Netw. 2025 Aug;188:107484. doi: 10.1016/j.neunet.2025.107484. Epub 2025 Apr 23.
6. Spatial-Temporal Convolutional Transformer Network for Multivariate Time Series Forecasting.
   Sensors (Basel). 2022 Jan 22;22(3):841. doi: 10.3390/s22030841.
7. MDWConv: CNN based on multi-scale atrous pyramid and depthwise separable convolution for long time series forecasting.
   Neural Netw. 2025 May;185:107139. doi: 10.1016/j.neunet.2025.107139. Epub 2025 Jan 16.
8. Developing a multivariate time series forecasting framework based on stacked autoencoders and multi-phase feature.
   Heliyon. 2024 Mar 19;10(7):e27860. doi: 10.1016/j.heliyon.2024.e27860. eCollection 2024 Apr 15.
9. Emotion Forecasting: A Transformer-Based Approach.
   J Med Internet Res. 2025 Mar 18;27:e63962. doi: 10.2196/63962.
10. TLTNet: A novel transscale cascade layered transformer network for enhanced retinal blood vessel segmentation.
    Comput Biol Med. 2024 Aug;178:108773. doi: 10.1016/j.compbiomed.2024.108773. Epub 2024 Jun 25.