
Neural networks based on attention architecture are robust to data missingness for early predicting hospital mortality in intensive care unit patients.

Author Information

Zeng Zhixuan, Liu Yang, Yao Shuo, Liu Jiqiang, Xiao Bing, Liu Chenxue, Gong Xun

Affiliations

Department of Emergency Medicine, The Second Xiangya Hospital of Central South University, Changsha, China.

Department of Rehabilitation, The Second Xiangya Hospital of Central South University, Changsha, China.

Publication Information

Digit Health. 2023 May 7;9:20552076231171482. doi: 10.1177/20552076231171482. eCollection 2023 Jan-Dec.

Abstract

BACKGROUND

Although machine learning models developed on electronic health records have become a promising method for early prediction of hospital mortality, few studies focus on approaches for handling missing data in electronic health records or evaluate model robustness to data missingness. This study proposes an attention architecture that shows excellent predictive performance and is robust to data missingness.

METHODS

Two public intensive care unit databases were used for model training and external validation, respectively. Three neural networks based on the attention architecture (masked attention model, attention model with imputation, attention model with missing indicator) were developed, using a masked attention mechanism, multiple imputation, and a missing indicator, respectively, to handle missing data. Model interpretability was analyzed through attention allocations. Extreme gradient boosting and logistic regression with multiple imputation or a missing indicator (logistic regression with imputation, logistic regression with missing indicator) served as baseline models. Model discrimination and calibration were evaluated by the area under the receiver operating characteristic curve, the area under the precision-recall curve, and calibration curves. In addition, model robustness to data missingness in both model training and validation was evaluated through three analyses.
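The abstract does not give implementation details, but the three missing-data strategies it names can be sketched roughly as follows. This is a minimal NumPy sketch under stated assumptions: the `build_inputs` helper and its strategy names are hypothetical, single-value fill stands in for multiple imputation, and the returned observation mask is the kind of input a masked attention layer would consume.

```python
import numpy as np

def build_inputs(x, strategy="indicator", fill_value=0.0):
    """Prepare a feature vector containing missing values (NaN).

    Hypothetical illustration of the three strategies from the abstract:
    - "mask":      keep a placeholder for missing entries and return a
                   boolean mask so attention can ignore them (masked attention)
    - "impute":    replace NaNs with a fill value (stand-in for multiple imputation)
    - "indicator": impute, then append binary missing-indicator features
    """
    missing = np.isnan(x)
    if strategy == "mask":
        filled = np.where(missing, 0.0, x)  # placeholder; attention masks it out
        return filled, ~missing             # mask is True where value is observed
    imputed = np.where(missing, fill_value, x)
    if strategy == "impute":
        return imputed, None
    if strategy == "indicator":
        # doubled feature vector: values followed by 0/1 missingness flags
        return np.concatenate([imputed, missing.astype(float)]), None
    raise ValueError(f"unknown strategy: {strategy}")
```

For example, `build_inputs(np.array([1.0, np.nan, 3.0]), "indicator")` yields the feature vector `[1.0, 0.0, 3.0, 0.0, 1.0, 0.0]`, where the trailing flags mark which inputs were missing.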

RESULTS

In total, 65,623 and 150,753 intensive care unit stays were included in the training set and the test set, respectively, with mortality of 10.1% and 8.5% and overall missing rates of 10.3% and 19.7%. The attention model with missing indicator had the highest area under the receiver operating characteristic curve (0.869; 95% CI: 0.865 to 0.873) in external validation, while the attention model with imputation had the highest area under the precision-recall curve (0.497; 95% CI: 0.480 to 0.513). The masked attention model and the attention model with imputation showed better calibration than the other models. The three neural networks showed different patterns of attention allocation. In terms of robustness to data missingness, the masked attention model and the attention model with missing indicator were more robust to missing data in model training, while the attention model with imputation was more robust to missing data in model validation.
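The area under the receiver operating characteristic curve reported above can be read as the probability that a randomly chosen non-survivor receives a higher risk score than a randomly chosen survivor. A minimal NumPy sketch of that equivalence via the Mann-Whitney U statistic (the `auroc` helper is illustrative, not from the paper):

```python
import numpy as np

def auroc(y_true, scores):
    """AUROC as the Mann-Whitney U statistic: the probability that a
    random positive case is scored above a random negative case
    (ties between a positive and a negative score count as one half)."""
    y_true = np.asarray(y_true)
    scores = np.asarray(scores)
    pos = scores[y_true == 1]   # scores of positive (e.g. deceased) cases
    neg = scores[y_true == 0]   # scores of negative (e.g. survivor) cases
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))
```

For example, `auroc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8])` evaluates three of four positive-negative pairs as correctly ordered, giving 0.75.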

CONCLUSIONS

The attention architecture has the potential to become an excellent model architecture for clinical prediction tasks with data missingness.


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9fe3/10170607/525beced93c3/10.1177_20552076231171482-fig1.jpg
