
Attention-Based Sequence-to-Sequence Model for Time Series Imputation.

Authors

Li Yurui, Du Mingjing, He Sheng

Affiliations

School of Computer Science and Technology, Jiangsu Normal University, Xuzhou 221116, China.

Publication

Entropy (Basel). 2022 Dec 9;24(12):1798. doi: 10.3390/e24121798.

DOI: 10.3390/e24121798
PMID: 36554203
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC9778091/
Abstract

Time series data are usually characterized by missing values, high dimensionality, and large volume. To address high-dimensional time series with missing values, this paper proposes an attention-based sequence-to-sequence model for imputing missing values in time series (ASSM), which combines feature learning with data imputation. The model consists of two parts, an encoder and a decoder. The encoder is a bidirectional GRU (BiGRU) recurrent neural network that incorporates a self-attention mechanism, making the model better able to handle long-range time series; the decoder is a GRU recurrent neural network that incorporates a cross-attention mechanism associating it with the encoder. The relationship weights between the sequences generated in the decoder and the known sequences in the encoder are computed so that the model focuses on highly correlated sequences. In this paper, we conduct comparison experiments on four real datasets with four evaluation metrics and six baseline models. The experimental results show that the proposed model outperforms the six competing missing-value imputation algorithms.
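The cross-attention step the abstract describes — decoder hidden states acting as queries over the encoder's known-sequence states, with relationship weights concentrating on highly correlated positions — can be sketched as scaled dot-product attention. This is a minimal NumPy illustration of that mechanism only; the paper's exact scoring function, gating, and imputation head are not reproduced here, and all array names are hypothetical:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(queries, keys, values):
    """Scaled dot-product attention: each decoder state (query) is compared
    against all encoder states (keys); the resulting weights mix the
    encoder states (values) into a context vector per decoder step."""
    d = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d)   # (T_dec, T_enc) similarity scores
    weights = softmax(scores, axis=-1)       # each row sums to 1
    return weights @ values, weights

# Toy example: 2 decoder steps attend over 4 encoder steps, hidden size 8.
rng = np.random.default_rng(0)
enc = rng.normal(size=(4, 8))   # hypothetical BiGRU encoder hidden states
dec = rng.normal(size=(2, 8))   # hypothetical GRU decoder hidden states
context, w = cross_attention(dec, enc, enc)
print(context.shape, w.shape)   # (2, 8) (2, 4)
```

In an imputation setting, the context vector for each decoder step would then be combined with the decoder state to estimate the missing value at that position; self-attention in the encoder follows the same computation with queries, keys, and values all drawn from the encoder states.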


Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e172/9778091/f1fc45ae2927/entropy-24-01798-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e172/9778091/164240c0289f/entropy-24-01798-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e172/9778091/dfa004649eeb/entropy-24-01798-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e172/9778091/84374239657e/entropy-24-01798-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e172/9778091/f36196c512e0/entropy-24-01798-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e172/9778091/b0eae1ab2f6a/entropy-24-01798-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e172/9778091/d2cd9c4c9fb8/entropy-24-01798-g007.jpg

Similar Articles

1
Attention-Based Sequence-to-Sequence Model for Time Series Imputation.
Entropy (Basel). 2022 Dec 9;24(12):1798. doi: 10.3390/e24121798.
2
ContrAttNet: Contribution and attention approach to multivariate time-series data imputation.
Network. 2024 Jun 3:1-24. doi: 10.1080/0954898X.2024.2360157.
3
Recurrent Neural Networks for Multivariate Time Series with Missing Values.
Sci Rep. 2018 Apr 17;8(1):6085. doi: 10.1038/s41598-018-24271-9.
4
A novel neural network for improved in-hospital mortality prediction with irregular and incomplete multivariate data.
Neural Netw. 2023 Oct;167:741-750. doi: 10.1016/j.neunet.2023.07.033. Epub 2023 Aug 12.
5
A novel Encoder-Decoder model based on read-first LSTM for air pollutant prediction.
Sci Total Environ. 2021 Apr 15;765:144507. doi: 10.1016/j.scitotenv.2020.144507. Epub 2021 Jan 5.
6
A prediction and imputation method for marine animal movement data.
PeerJ Comput Sci. 2021 Aug 3;7:e656. doi: 10.7717/peerj-cs.656. eCollection 2021.
7
Attention-based Imputation of Missing Values in Electronic Health Records Tabular Data.
Proc (IEEE Int Conf Healthc Inform). 2024 Jun;2024:177-182. doi: 10.1109/ichi61247.2024.00030. Epub 2024 Aug 22.
8
A PCA-EEMD-CNN-Attention-GRU-Encoder-Decoder Accurate Prediction Model for Key Parameters of Seawater Quality in Zhanjiang Bay.
Materials (Basel). 2022 Jul 27;15(15):5200. doi: 10.3390/ma15155200.
9
Dual Attention-Based Encoder-Decoder: A Customized Sequence-to-Sequence Learning for Soft Sensor Development.
IEEE Trans Neural Netw Learn Syst. 2021 Aug;32(8):3306-3317. doi: 10.1109/TNNLS.2020.3015929. Epub 2021 Aug 3.
10
A hierarchical temporal attention-based LSTM encoder-decoder model for individual mobility prediction.
Neurocomputing (Amst). 2020 Aug 25;403:153-166. doi: 10.1016/j.neucom.2020.03.080. Epub 2020 May 1.

Cited By

1
Deep learning in structural bioinformatics: current applications and future perspectives.
Brief Bioinform. 2024 Mar 27;25(3). doi: 10.1093/bib/bbae042.

References

1
Multi-source sequential knowledge regression by using transfer RNN units.
Neural Netw. 2019 Nov;119:151-161. doi: 10.1016/j.neunet.2019.08.004. Epub 2019 Aug 17.
2
A Deep CNN-LSTM Model for Particulate Matter (PM) Forecasting in Smart Cities.
Sensors (Basel). 2018 Jul 10;18(7):2220. doi: 10.3390/s18072220.
3
Missing value imputation strategies for metabolomics data.
Electrophoresis. 2015 Dec;36(24):3050-60. doi: 10.1002/elps.201500352. Epub 2015 Oct 20.
4
Impact of missing data imputation methods on gene expression clustering and classification.
BMC Bioinformatics. 2015 Feb 26;16:64. doi: 10.1186/s12859-015-0494-3.
5
Missing value imputation for microarray data: a comprehensive comparison study and a web tool.
BMC Syst Biol. 2013;7 Suppl 6(Suppl 6):S12. doi: 10.1186/1752-0509-7-S6-S12. Epub 2013 Dec 13.
6
Missing data imputation using statistical and machine learning methods in a real breast cancer problem.
Artif Intell Med. 2010 Oct;50(2):105-15. doi: 10.1016/j.artmed.2010.05.002. Epub 2010 Jul 16.
7
Gaussian mixture clustering and imputation of microarray data.
Bioinformatics. 2004 Apr 12;20(6):917-23. doi: 10.1093/bioinformatics/bth007. Epub 2004 Jan 29.
8
Missing value estimation methods for DNA microarrays.
Bioinformatics. 2001 Jun;17(6):520-5. doi: 10.1093/bioinformatics/17.6.520.