Benish, William A.
Department of Internal Medicine, Case Western Reserve University, Cleveland, OH 44106, USA.
Entropy (Basel). 2020 Jan 14;22(1):97. doi: 10.3390/e22010097.
The fundamental information theory functions of entropy, relative entropy, and mutual information are directly applicable to clinical diagnostic testing. This is a consequence of the fact that an individual's disease state and diagnostic test result are random variables. In this paper, we review the application of information theory to the quantification of diagnostic uncertainty, diagnostic information, and diagnostic test performance. An advantage of information theory functions over more established test performance measures is that they can be used when multiple disease states are under consideration as well as when the diagnostic test can yield multiple or continuous results. Since more than one diagnostic test is often required to help determine a patient's disease state, we also discuss the application of the theory to situations in which more than one diagnostic test is used. The total diagnostic information provided by two or more tests can be partitioned into meaningful components.
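The quantities the abstract names can be illustrated with a short calculation. The sketch below uses a hypothetical joint distribution P(disease, result) for a two-disease-state test with three possible results (the numbers are invented for illustration, not taken from the paper): the entropy of the disease marginal measures pre-test diagnostic uncertainty, and the mutual information I(D;R) = H(D) + H(R) − H(D,R) measures the diagnostic information the test provides.

```python
import math

def entropy(p):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Hypothetical joint distribution P(disease state, test result):
# rows = disease present / absent, columns = three test results.
joint = [
    [0.20, 0.08, 0.02],  # disease present
    [0.05, 0.15, 0.50],  # disease absent
]

p_disease = [sum(row) for row in joint]           # marginal P(D)
p_result = [sum(col) for col in zip(*joint)]      # marginal P(R)

h_disease = entropy(p_disease)                    # pre-test diagnostic uncertainty
h_result = entropy(p_result)
h_joint = entropy([p for row in joint for p in row])

# Mutual information I(D;R) = H(D) + H(R) - H(D,R): the diagnostic
# information (in bits) that the test result provides about disease state.
mutual_info = h_disease + h_result - h_joint
print(f"H(D) = {h_disease:.3f} bits, I(D;R) = {mutual_info:.3f} bits")
```

Because mutual information is defined for any finite number of disease states and test results, the same few lines apply unchanged to multi-state, multi-result tests, which is the flexibility the abstract highlights over binary measures such as sensitivity and specificity.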