

TransformerLSR: Attentive joint model of longitudinal data, survival, and recurrent events with concurrent latent structure.

Author information

Zhang Zhiyue, Zhao Yao, Xu Yanxun

Affiliations

Department of Applied Mathematics and Statistics, Johns Hopkins University, 3100 Wyman Park Dr, Baltimore, 21211, MD, USA.

Department of Applied Mathematics and Statistics, Johns Hopkins University, 3100 Wyman Park Dr, Baltimore, 21211, MD, USA; Division of Biostatistics and Bioinformatics, School of Medicine, Johns Hopkins University, 733 N Broadway, Baltimore, 21205, MD, USA.

Publication information

Artif Intell Med. 2025 Feb;160:103056. doi: 10.1016/j.artmed.2024.103056. Epub 2024 Dec 16.

Abstract

In applications such as biomedical studies, epidemiology, and social sciences, recurrent events often co-occur with longitudinal measurements and a terminal event, such as death. Therefore, jointly modeling longitudinal measurements, recurrent events, and survival data while accounting for their dependencies is critical. While joint models for the three components exist in the statistical literature, many of these approaches are limited by heavy parametric assumptions and scalability issues. Recently, incorporating deep learning techniques into joint modeling has shown promising results. However, current methods only address joint modeling of longitudinal measurements at regularly spaced observation times and survival events, neglecting recurrent events. In this paper, we develop TransformerLSR, a flexible transformer-based deep modeling and inference framework to jointly model all three components simultaneously. TransformerLSR integrates deep temporal point processes into the joint modeling framework, treating recurrent and terminal events as two competing processes dependent on past longitudinal measurements and recurrent event times. Additionally, TransformerLSR introduces a novel trajectory representation and model architecture to potentially incorporate a priori knowledge of known latent structures among concurrent longitudinal variables. We demonstrate the effectiveness and necessity of TransformerLSR through simulation studies and an analysis of a real-world medical dataset of patients after kidney transplantation.
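The competing-process formulation described above can be illustrated with a minimal sketch of a temporal point-process log-likelihood, where recurrent and terminal events each carry their own conditional intensity and the compensator (integrated total intensity) is subtracted over the observation window. This is an assumption-laden toy, not the paper's method: `competing_log_likelihood` and the constant-hazard lambdas below are hypothetical stand-ins for TransformerLSR's transformer-conditioned intensity functions, and the integral is approximated by a left Riemann sum.

```python
import math

def competing_log_likelihood(event_times, event_types, horizon, intensity_fns, n_grid=1000):
    """Toy log-likelihood of a competing-risks temporal point process.

    event_times   : sorted event times in (0, horizon]
    event_types   : 0 = recurrent event, 1 = terminal event
    intensity_fns : one callable per event type, lam_k(t, history) -> float;
                    in a TransformerLSR-style model these would condition on the
                    encoded history of longitudinal measurements and past events
    """
    def history(t):  # events strictly before time t
        return [(s, c) for s, c in zip(event_times, event_types) if s < t]

    # log-intensity contribution of each observed event
    ll = sum(math.log(intensity_fns[k](t, history(t)))
             for t, k in zip(event_times, event_types))

    # subtract the compensator: integral of the summed intensities over [0, horizon]
    dt = horizon / n_grid
    for i in range(n_grid):
        t = i * dt
        ll -= sum(f(t, history(t)) for f in intensity_fns) * dt
    return ll

# toy usage with constant hazards (placeholders for learned, history-dependent intensities)
lam_recurrent = lambda t, hist: 0.5
lam_terminal = lambda t, hist: 0.1
ll = competing_log_likelihood([1.0, 2.5], [0, 0], horizon=3.0,
                              intensity_fns=[lam_recurrent, lam_terminal])
```

With constant hazards the compensator is exact, so the result matches the closed form 2·log(0.5) − (0.5 + 0.1)·3; a neural variant would replace the lambdas with networks and the Riemann sum with Monte Carlo integration.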


