

An improved transformer based traffic flow prediction model.

Authors

Liu Shipeng, Wang Xingjian

Affiliations

College of Computer and Control Engineering, Northeast Forestry University, HeXing Road, Harbin, China.

Publication

Sci Rep. 2025 Mar 10;15(1):8284. doi: 10.1038/s41598-025-92425-7.

DOI: 10.1038/s41598-025-92425-7
PMID: 40065142
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11893897/
Abstract

Traffic flow prediction is a key challenge in intelligent transportation, and the ability to accurately forecast future traffic flow directly affects the efficiency of urban transportation systems. However, existing deep learning-based prediction models suffer from the following issues: First, CNN- or RNN-based models are limited by their architecture and unsuitable for modeling long-term sequences. Second, most Transformer-based methods focus solely on the traffic flow data itself during embedding, neglecting the implicit information behind the traffic data. This implicit information includes behavioral trends, community and surrounding traffic patterns, urban weather, semantic information, and temporal periodicity. Third, methods using the original multi-head self-attention mechanism calculate attention scores point by point in the temporal dimension without utilizing contextual information, which to some extent leads to less accurate attention computation. Fourth, existing methods struggle to capture long and short-range spatial dependencies simultaneously. To address these four issues, we propose an IEEAFormer technique (Implicit-information Embedding and Enhanced Spatial-Temporal Multi-Head Attention Transformer). First, it adopts a Transformer architecture and incorporates an embedding layer to capture implicit information in the input. Secondly, the method replaces the traditional multi-head self-attention with time-environment-aware self-attention in the temporal dimension, enabling each node to perceive the contextual environment. Additionally, the technique uses two unique graph mask matrices in the spatial dimension. It employs a novel parallel spatial self-attention architecture to capture both long-range and short-range dependencies in the data simultaneously. The results verified on four real-world traffic datasets show that the proposed IEEAFormer outperforms most existing models regarding prediction performance.
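The parallel spatial self-attention described in the abstract — two branches restricted by different graph mask matrices, one for short-range and one for long-range node pairs — can be illustrated with a toy sketch. This is a minimal NumPy illustration of masked attention over a small ring graph, not the paper's IEEAFormer implementation; the shared query/key projection, the ring topology, and the mask construction are all simplifying assumptions made here for brevity.

```python
import numpy as np

def masked_attention(x, mask):
    """Scaled dot-product self-attention restricted to node pairs allowed by `mask`.

    x:    (N, d) node feature matrix (queries and keys share weights in this toy sketch)
    mask: (N, N) boolean matrix; True means node i may attend to node j
    """
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)
    scores = np.where(mask, scores, -1e9)   # block disallowed node pairs
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ x                       # attended node representations

# Toy road network: 4 sensors connected in a ring (illustrative assumption)
N, d = 4, 8
rng = np.random.default_rng(0)
x = rng.standard_normal((N, d))

adj = np.eye(N, dtype=bool)
for i in range(N):
    adj[i, (i + 1) % N] = adj[i, (i - 1) % N] = True  # self + 1-hop neighbors

short_mask = adj                                # short-range branch: immediate neighbors
long_mask = ~adj | np.eye(N, dtype=bool)        # long-range branch: distant nodes (plus self)

# Run both branches in parallel and merge (here, a simple sum)
out = masked_attention(x, short_mask) + masked_attention(x, long_mask)
print(out.shape)  # (4, 8)
```

The key design point mirrored here is that each branch sees the same node features but a different mask, so short- and long-range spatial dependencies are captured simultaneously rather than by a single attention pattern.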


Figures (PMC):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/385c/11893897/4e4183417670/41598_2025_92425_Fig1_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/385c/11893897/664a654935d0/41598_2025_92425_Fig2_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/385c/11893897/faa32d96f9d6/41598_2025_92425_Fig3_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/385c/11893897/4ca3ea6c664e/41598_2025_92425_Fig4_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/385c/11893897/b0a97e6d9e73/41598_2025_92425_Fig5_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/385c/11893897/96cb8aab79e2/41598_2025_92425_Fig6_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/385c/11893897/030a3cda8db4/41598_2025_92425_Fig7_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/385c/11893897/f0525e17d0f8/41598_2025_92425_Fig8_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/385c/11893897/499d82537cd5/41598_2025_92425_Fig9_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/385c/11893897/1d36b92d58d2/41598_2025_92425_Fig10_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/385c/11893897/8e7f86864f06/41598_2025_92425_Fig11_HTML.jpg

Similar Articles

1
An improved transformer based traffic flow prediction model.
Sci Rep. 2025 Mar 10;15(1):8284. doi: 10.1038/s41598-025-92425-7.
2
An Adaptive Spatio-Temporal Traffic Flow Prediction Using Self-Attention and Multi-Graph Networks.
Sensors (Basel). 2025 Jan 6;25(1):282. doi: 10.3390/s25010282.
3
Linear attention based spatiotemporal multi graph GCN for traffic flow prediction.
Sci Rep. 2025 Mar 10;15(1):8249. doi: 10.1038/s41598-025-93179-y.
4
TEA-GCN: Transformer-Enhanced Adaptive Graph Convolutional Network for Traffic Flow Forecasting.
Sensors (Basel). 2024 Nov 4;24(21):7086. doi: 10.3390/s24217086.
5
Spatio-temporal causal graph attention network for traffic flow prediction in intelligent transportation systems.
PeerJ Comput Sci. 2023 Jul 28;9:e1484. doi: 10.7717/peerj-cs.1484. eCollection 2023.
6
Spatial-Temporal Attention Mechanism and Graph Convolutional Networks for Destination Prediction.
Front Neurorobot. 2022 Jul 6;16:925210. doi: 10.3389/fnbot.2022.925210. eCollection 2022.
7
Multi-Granularity Temporal Embedding Transformer Network for Traffic Flow Forecasting.
Sensors (Basel). 2024 Dec 19;24(24):8106. doi: 10.3390/s24248106.
8
Deep transformer-based heterogeneous spatiotemporal graph learning for geographical traffic forecasting.
iScience. 2024 Jun 25;27(7):110175. doi: 10.1016/j.isci.2024.110175. eCollection 2024 Jul 19.
9
DyGraphformer: Transformer combining dynamic spatio-temporal graph network for multivariate time series forecasting.
Neural Netw. 2025 Jan;181:106776. doi: 10.1016/j.neunet.2024.106776. Epub 2024 Oct 17.
10
Multicomponent Spatial-Temporal Graph Attention Convolution Networks for Traffic Prediction with Spatially Sparse Data.
Comput Intell Neurosci. 2021 Dec 23;2021:9134942. doi: 10.1155/2021/9134942. eCollection 2021.

Cited By

1
Spatio-temporal transformer and graph convolutional networks based traffic flow prediction.
Sci Rep. 2025 Jul 7;15(1):24299. doi: 10.1038/s41598-025-10287-5.

References

1
A multi-feature spatial-temporal fusion network for traffic flow prediction.
Sci Rep. 2024 Jun 20;14(1):14264. doi: 10.1038/s41598-024-65040-1.