Adler Andy, Youmaran Richard, Lionheart William R B
Systems and Computer Engineering, Carleton University, Ottawa, Canada.
Physiol Meas. 2008 Jun;29(6):S101-9. doi: 10.1088/0967-3334/29/6/S09. Epub 2008 Jun 10.
We ask: how many bits of information (in the Shannon sense) do we get from a set of EIT measurements? Here, the term information in measurements (IM) is defined as the decrease in uncertainty about the contents of a medium due to a set of measurements. This decrease in uncertainty is quantified by the change from the inter-class model, q, defined by the prior information, to the intra-class model, p, given by the measured data (corrupted by noise). IM is measured by the expected relative entropy (Kullback-Leibler divergence) between the distributions q and p, and corresponds to the channel capacity in an analogous communications system. Based on a Gaussian model of the measurement noise, Σ_n, and a prior model of the image element covariances, Σ_x, we calculate IM = ½ Σ_i log₂(SNR_i + 1), where SNR_i is the signal-to-noise ratio of each independent measurement, calculated from the prior and noise models. As an example, we consider saline tank measurements from a 16-electrode EIT system with a 2 cm radius non-conductive target, and calculate IM = 179 bits. Temporal sequences of frames are also considered, and formulae for IM as a function of temporal image element correlations are derived. We suggest that this measure may allow novel insights into questions such as distinguishability limits, optimal measurement schemes and data fusion.
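The abstract's central formula, IM = ½ Σ_i log₂(SNR_i + 1), can be illustrated with a minimal sketch. The per-measurement SNR values below are illustrative placeholders, not the values from the paper's saline tank experiment; in the paper, each SNR_i is derived from the prior covariance Σ_x and noise covariance Σ_n.

```python
import numpy as np

def information_in_measurements(snr):
    """Total IM, in bits, summed over independent measurement channels.

    Each channel with signal-to-noise ratio SNR_i contributes
    0.5 * log2(SNR_i + 1) bits, the Shannon capacity of a Gaussian channel.
    """
    snr = np.asarray(snr, dtype=float)
    return 0.5 * np.sum(np.log2(snr + 1.0))

# Illustrative example: four channels, each with SNR = 3,
# so each contributes 0.5 * log2(4) = 1 bit.
im = information_in_measurements([3.0, 3.0, 3.0, 3.0])
print(im)  # 4.0 bits total
```

Summing per-channel capacities in this way assumes the measurements are statistically independent, as the abstract states.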