
STTRE: A Spatio-Temporal Transformer with Relative Embeddings for multivariate time series forecasting.

Affiliations

Department of Engineering, City University of London, Northampton Square, London, EC1V 0HB, England, United Kingdom.

Department of Computer Science, City University of London, Northampton Square, London, EC1V 0HB, England, United Kingdom.

Publication

Neural Netw. 2023 Nov;168:549-559. doi: 10.1016/j.neunet.2023.09.039. Epub 2023 Sep 30.

Abstract

The prevalence of multivariate time series data across several disciplines fosters a demand and, subsequently, significant growth in the research and advancement of multivariate time series analysis. Drawing inspiration from a popular natural language processing model, the Transformer, we propose the Spatio-Temporal Transformer with Relative Embeddings (STTRE) to address multivariate time series forecasting. This work primarily focuses on developing a Transformer-based framework that can fully exploit the spatio-temporal nature of a multivariate time series by incorporating several of the Transformer's key components, but with augmentations that allow them to excel in multivariate time series forecasting. Current Transformer-based models for multivariate time series often neglect the data's spatial component(s) and utilize absolute position embeddings as their only means to detect the data's temporal component(s), which we show is flawed for time series applications. The lack of emphasis on fully exploiting the spatio-temporality of the data can incur subpar results in terms of accuracy. We redesign relative position representations, which we rename to relative embeddings, to unveil a new method for detecting latent spatial, temporal, and spatio-temporal dependencies more effectively than previous Transformer-based models. We couple these relative embeddings with a restructuring of the Transformer's primary sequence learning mechanism, multi-head attention, in a way that allows for full utilization of relative embeddings, thus achieving up to a 24% improvement in accuracy over other state-of-the-art multivariate time series models on a comprehensive selection of publicly available multivariate time series forecasting datasets.
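The abstract's central idea is to replace absolute position embeddings with relative embeddings inside multi-head attention. The paper's exact STTRE formulation is not reproduced here; as a rough single-head illustration of the underlying mechanism it builds on (relative position representations in the style of Shaw et al., 2018), the sketch below adds a learned per-offset term to the attention scores. All names, the clipping distance, and the score formula are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def relative_attention(Q, K, V, rel_emb):
    """Single-head attention with relative position embeddings.

    rel_emb[i, j] is a learned vector for the offset (j - i); it is added
    to the content-based score before the softmax (illustrative only).
    """
    d = Q.shape[-1]
    content = Q @ K.T                                # (T, T) content scores
    position = np.einsum('id,ijd->ij', Q, rel_emb)   # (T, T) relative scores
    scores = (content + position) / np.sqrt(d)
    return softmax(scores) @ V

T, d, max_dist = 8, 4, 3                 # sequence length, head dim, clip distance
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(T, d)) for _ in range(3))

# One embedding per clipped relative offset, expanded to a (T, T, d) lookup.
offsets = np.clip(np.arange(T)[None, :] - np.arange(T)[:, None], -max_dist, max_dist)
table = rng.normal(size=(2 * max_dist + 1, d))
rel_emb = table[offsets + max_dist]

out = relative_attention(Q, K, V, rel_emb)
print(out.shape)  # (8, 4)
```

Because the position term depends only on the offset j - i, the model can detect recurring temporal patterns regardless of where they fall in the window, which is the property the abstract argues absolute embeddings lack for time series.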

