

Bayesian Uncertainty Quantification with Multi-Fidelity Data and Gaussian Processes for Impedance Cardiography of Aortic Dissection.

Author Information

Ranftl Sascha, Melito Gian Marco, Badeli Vahid, Reinbacher-Köstinger Alice, Ellermann Katrin, von der Linden Wolfgang

Affiliations

Institute of Theoretical Physics-Computational Physics, Graz University of Technology, 8010 Graz, Austria.

Institute of Mechanics, Graz University of Technology, 8010 Graz, Austria.

Publication Information

Entropy (Basel). 2019 Dec 31;22(1):58. doi: 10.3390/e22010058.

Abstract

In 2000, Kennedy and O'Hagan proposed a model for uncertainty quantification that combines data of several levels of sophistication, fidelity, quality, or accuracy, e.g., a coarse and a fine mesh in finite-element simulations. They assumed each level to be describable by a Gaussian process, and used low-fidelity simulations to improve inference on costly high-fidelity simulations. Departing from there, we move away from the common non-Bayesian practice of optimization and marginalize the parameters instead. Thus, we avoid the awkward logical dilemma of having to choose parameters and of neglecting that choice's uncertainty. We propagate the parameter uncertainties by averaging the predictions and the prediction uncertainties over all the possible parameters. This is done analytically for all but the nonlinear or inseparable kernel function parameters. What is left is a low-dimensional and feasible numerical integral depending on the choice of kernels, thus allowing for a fully Bayesian treatment. By quantifying the uncertainties of the parameters themselves too, we show that "learning" or optimising those parameters has little meaning when data is scarce and, thus, justify all our mathematical efforts. The recent hype about machine learning has long spilled over to computational engineering but fails to acknowledge that machine learning is a big-data problem, whereas in computational engineering we usually face a small-data problem. We devise the fully Bayesian uncertainty quantification method in a notation following the tradition of E.T. Jaynes and find that generalization to an arbitrary number of levels of fidelity and parallelisation becomes rather easy. We scrutinize the method with mock data and demonstrate its advantages in its natural application, where high-fidelity data is scarce but low-fidelity data is not. We then apply the method to quantify the uncertainties in finite-element simulations of impedance cardiography of aortic dissection. Aortic dissection is a cardiovascular disease that frequently requires immediate surgical treatment and, thus, a fast diagnosis beforehand. While traditional medical imaging techniques such as computed tomography, magnetic resonance tomography, or echocardiography certainly do the job, impedance cardiography too is a clinical standard tool and promises to allow earlier diagnoses as well as to detect patients who would otherwise go under the radar for too long.
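The two ideas the abstract combines, the Kennedy-O'Hagan autoregressive multi-fidelity model y_h(x) = ρ·y_l(x) + δ(x) and the marginalisation of kernel hyperparameters via a low-dimensional numerical integral, can be sketched in a few lines of NumPy. This is an illustrative toy implementation, not the authors' code: the squared-exponential kernel, the single shared length-scale, the scaling factor ρ, the grid bounds, and the mock data are all assumptions made for the example.

```python
import numpy as np

def rbf(x1, x2, ls):
    """Squared-exponential kernel matrix."""
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def mf_posterior(xl, yl, xh, yh, xs, ls, rho, noise=1e-6):
    """Posterior mean/variance of the high-fidelity response at xs under
    y_h(x) = rho * y_l(x) + delta(x), with y_l and delta independent GPs
    that (for simplicity) share one length-scale ls.  Also returns the
    log marginal likelihood (evidence) of ls given both data sets."""
    K = np.block([
        [rbf(xl, xl, ls),        rho * rbf(xl, xh, ls)],
        [rho * rbf(xh, xl, ls),  rho**2 * rbf(xh, xh, ls) + rbf(xh, xh, ls)],
    ]) + noise * np.eye(len(xl) + len(xh))
    ks = np.vstack([rho * rbf(xl, xs, ls),
                    rho**2 * rbf(xh, xs, ls) + rbf(xh, xs, ls)])
    y = np.concatenate([yl, yh])
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    v = np.linalg.solve(L, ks)
    mean = ks.T @ alpha
    var = (rho**2 + 1.0) - np.einsum('ij,ij->j', v, v)  # prior var of y_h is rho^2 + 1
    logev = (-0.5 * y @ alpha - np.log(np.diag(L)).sum()
             - 0.5 * len(y) * np.log(2 * np.pi))
    return mean, var, logev

# Mock data: plentiful cheap low-fidelity samples, few costly high-fidelity ones.
xl = np.linspace(0.0, 1.0, 20)
xh = np.array([0.1, 0.5, 0.9])
yl = np.sin(8 * xl)
yh = 1.2 * np.sin(8 * xh) + 0.3 * xh
xs = np.linspace(0.0, 1.0, 5)

# Marginalise the length-scale over a grid, weighting each value by its
# normalised evidence -- the low-dimensional numerical integral that remains
# after the analytic marginalisation steps.
grid = np.linspace(0.05, 0.5, 10)
stats = [mf_posterior(xl, yl, xh, yh, xs, ls, rho=1.2) for ls in grid]
logw = np.array([s[2] for s in stats])
w = np.exp(logw - logw.max())
w /= w.sum()
mean = sum(wi * m for wi, (m, _, _) in zip(w, stats))
second = sum(wi * (v + m**2) for wi, (m, v, _) in zip(w, stats))
var = second - mean**2  # law of total variance over the length-scale
```

The final two lines are the point of the marginalisation: the predictive mean is the evidence-weighted average of the per-length-scale means, and the predictive variance follows from the law of total variance, so uncertainty about the length-scale itself is propagated into the prediction. With scarce data the evidence over the grid is broad, and a single optimised length-scale would understate the true uncertainty.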


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/479b/7516489/5d1c02d51f59/entropy-22-00058-g001.jpg
