A Review of the Application of Information Theory to Clinical Diagnostic Testing.

Author Information

Benish William A

Affiliation

Department of Internal Medicine, Case Western Reserve University, Cleveland, OH 44106, USA.

Publication Information

Entropy (Basel). 2020 Jan 14;22(1):97. doi: 10.3390/e22010097.

Abstract

The fundamental information theory functions of entropy, relative entropy, and mutual information are directly applicable to clinical diagnostic testing. This is a consequence of the fact that an individual's disease state and diagnostic test result are random variables. In this paper, we review the application of information theory to the quantification of diagnostic uncertainty, diagnostic information, and diagnostic test performance. An advantage of information theory functions over more established test performance measures is that they can be used when multiple disease states are under consideration as well as when the diagnostic test can yield multiple or continuous results. Since more than one diagnostic test is often required to help determine a patient's disease state, we also discuss the application of the theory to situations in which more than one diagnostic test is used. The total diagnostic information provided by two or more tests can be partitioned into meaningful components.
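As a concrete illustration of the quantities the abstract describes, the sketch below computes the pre-test diagnostic uncertainty H(D) and the diagnostic information I(D; T) from a joint distribution of disease state and test result, using the identity I(D; T) = H(D) + H(T) - H(D, T). The 2x2 joint distribution is entirely hypothetical, chosen only to make the arithmetic visible; the approach extends unchanged to multiple disease states and multi-valued test results.

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits; zero-probability outcomes contribute nothing."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Hypothetical joint distribution P(disease state, test result).
# Rows: disease absent / present; columns: test negative / positive.
joint = [[0.45, 0.05],
         [0.10, 0.40]]

p_disease = [sum(row) for row in joint]        # marginal P(D)
p_test = [sum(col) for col in zip(*joint)]     # marginal P(T)
p_flat = [p for row in joint for p in row]     # flattened joint P(D, T)

# Diagnostic uncertainty before testing: H(D)
h_disease = entropy(p_disease)

# Diagnostic information provided by the test: I(D;T) = H(D) + H(T) - H(D,T)
mi = h_disease + entropy(p_test) - entropy(p_flat)

print(f"Pre-test uncertainty H(D)  = {h_disease:.3f} bits")  # 1.000 bits here
print(f"Test information    I(D;T) = {mi:.3f} bits")
```

A perfectly informative test would yield I(D; T) = H(D), resolving all diagnostic uncertainty, while an uninformative test yields 0 bits. When two tests are used, the chain rule I(D; T1, T2) = I(D; T1) + I(D; T2 | T1) gives one way of partitioning the total diagnostic information into components, in the spirit of the abstract's final sentence.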


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0939/7516534/5936a81594a3/entropy-22-00097-g001.jpg
