Time Sequence Deep Learning Model for Ubiquitous Tabular Data with Unique 3D Tensors Manipulation.

Author Information

Gicic Adaleta, Đonko Dženana, Subasi Abdulhamit

Affiliations

Faculty of Electrical Engineering, University of Sarajevo, 71000 Sarajevo, Bosnia and Herzegovina.

Institute of Biomedicine, Faculty of Medicine, University of Turku, 20520 Turku, Finland.

Publication Information

Entropy (Basel). 2024 Sep 12;26(9):783. doi: 10.3390/e26090783.

DOI: 10.3390/e26090783
PMID: 39330116
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11431205/
Abstract

Although deep learning (DL) algorithms have been proved to be effective in diverse research domains, their application in developing models for tabular data remains limited. Models trained on tabular data demonstrate higher efficacy using traditional machine learning models than DL models, which are largely attributed to the size and structure of tabular datasets and the specific application contexts in which they are utilized. Thus, the primary objective of this paper is to propose a method to use the supremacy of Stacked Bidirectional LSTM (Long Short-Term Memory) deep learning algorithms in pattern discovery incorporating tabular data with customized 3D tensor modeling in feeding neural networks. Our findings are empirically validated using six diverse, publicly available datasets each varying in size and learning objectives. This paper proves that the proposed model based on time-sequence DL algorithms, which were generally described as inadequate when dealing with tabular data, yields satisfactory results and competes effectively with other algorithms specifically designed for tabular data. An additional benefit of this approach is its ability to preserve simplicity while ensuring fast model training also with large datasets. Even with extremely small datasets, models can be applied to achieve exceptional predictive results and fully utilize their capacity.
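The abstract's key idea is feeding flat tabular rows to a sequence model by first reshaping them into a customized 3D tensor of shape (samples, timesteps, features per step). The paper's exact tensor construction is detailed in the full text; the sketch below illustrates one plausible reshaping under stated assumptions (the function name and zero-padding scheme are illustrative, not the authors' implementation):

```python
import numpy as np

def tabular_to_3d(X, timesteps):
    """Reshape a 2D tabular array (samples, features) into a 3D tensor
    (samples, timesteps, features_per_step) suitable for an LSTM-style model.
    Trailing features are zero-padded so the feature count divides evenly."""
    n_samples, n_features = X.shape
    per_step = -(-n_features // timesteps)  # ceiling division
    padded = np.zeros((n_samples, per_step * timesteps), dtype=X.dtype)
    padded[:, :n_features] = X
    return padded.reshape(n_samples, timesteps, per_step)

# e.g. 100 rows with 10 features, treated as 4 pseudo-timesteps of 3 features
X = np.random.rand(100, 10)
T = tabular_to_3d(X, timesteps=4)
print(T.shape)  # (100, 4, 3)
```

A tensor of this shape can then be passed to a stacked bidirectional LSTM (e.g. Keras `Bidirectional(LSTM(...), ...)` layers), which is the model family the paper evaluates.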


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd98/11431205/b86121348f67/entropy-26-00783-g001.jpg
Figure 2: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd98/11431205/43bdd842b46c/entropy-26-00783-g002.jpg

Similar Articles

1. Time Sequence Deep Learning Model for Ubiquitous Tabular Data with Unique 3D Tensors Manipulation.
   Entropy (Basel). 2024 Sep 12;26(9):783. doi: 10.3390/e26090783.
2. Deep Neural Networks and Tabular Data: A Survey.
   IEEE Trans Neural Netw Learn Syst. 2024 Jun;35(6):7499-7519. doi: 10.1109/TNNLS.2022.3229161. Epub 2024 Jun 3.
3. Tabular deep learning: a comparative study applied to multi-task genome-wide prediction.
   BMC Bioinformatics. 2024 Oct 4;25(1):322. doi: 10.1186/s12859-024-05940-1.
4. Predicting recovery following stroke: Deep learning, multimodal data and feature selection using explainable AI.
   Neuroimage Clin. 2024;43:103638. doi: 10.1016/j.nicl.2024.103638. Epub 2024 Jul 2.
5. Perturbation of deep autoencoder weights for model compression and classification of tabular data.
   Neural Netw. 2022 Dec;156:160-169. doi: 10.1016/j.neunet.2022.09.020. Epub 2022 Sep 27.
6. Graph Neural Network contextual embedding for Deep Learning on tabular data.
   Neural Netw. 2024 May;173:106180. doi: 10.1016/j.neunet.2024.106180. Epub 2024 Feb 16.
7. Optimizing neural networks for medical data sets: A case study on neonatal apnea prediction.
   Artif Intell Med. 2019 Jul;98:59-76. doi: 10.1016/j.artmed.2019.07.008. Epub 2019 Jul 25.
8. Developing a multivariate time series forecasting framework based on stacked autoencoders and multi-phase feature.
   Heliyon. 2024 Mar 19;10(7):e27860. doi: 10.1016/j.heliyon.2024.e27860. eCollection 2024 Apr 15.
9. Ensemble machine learning model trained on a new synthesized dataset generalizes well for stress prediction using wearable devices.
   J Biomed Inform. 2023 Dec;148:104556. doi: 10.1016/j.jbi.2023.104556. Epub 2023 Dec 2.
10. Time series forecasting of new cases and new deaths rate for COVID-19 using deep learning methods.
    Results Phys. 2021 Aug;27:104495. doi: 10.1016/j.rinp.2021.104495. Epub 2021 Jun 26.

Cited By

1. Prediction of one-year recurrence among breast cancer patients undergone surgery using artificial intelligence-based algorithms: a retrospective study on prognostic factors.
   BMC Cancer. 2025 May 26;25(1):940. doi: 10.1186/s12885-025-14369-5.
2. Explainable artificial intelligence for stroke prediction through comparison of deep learning and machine learning models.
   Sci Rep. 2024 Dec 28;14(1):31392. doi: 10.1038/s41598-024-82931-5.

References

1. Deep Neural Networks and Tabular Data: A Survey.
   IEEE Trans Neural Netw Learn Syst. 2024 Jun;35(6):7499-7519. doi: 10.1109/TNNLS.2022.3229161. Epub 2024 Jun 3.