Electronics Laboratories, University of Kent at Canterbury, Kent CT2 7NT, England.
IEEE Trans Pattern Anal Mach Intell. 1983 Jun;5(6):661-4. doi: 10.1109/tpami.1983.4767458.
The existence of an upper bound on the error probability, as a function of the I-divergences between an original and an approximating distribution, is proved. This bound is shown to be a monotonic nondecreasing function of the I-divergences, reaching the Bayes error probability when they vanish. It has previously been shown that if the closeness between the original and approximating distributions is assessed by the probability of error in a particular two-class recognition problem in which those functions serve as the class-conditional distributions, then the best upper bound on that probability is 1/2, regardless of the value of the I-divergences between them. Approaching the approximation problem from a rather different viewpoint, this correspondence considers a two-class discrete-measurement classification problem in which the original distributions are replaced by approximations, and examines the effect on the probability of error. The corresponding analysis is presented in detail.
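The setting described above can be sketched numerically. The snippet below is an illustrative sketch, not the paper's construction: it computes the I-divergences (Kullback-Leibler divergences) between true and approximating class-conditional distributions on a discrete measurement space, the Bayes error under the true distributions, and the error actually incurred when the decision rule is built from the approximations. The specific distributions are hypothetical values chosen for illustration.

```python
import numpy as np

def i_divergence(p, q):
    # I-divergence (Kullback-Leibler divergence) between discrete
    # distributions p and q; terms with p(x) = 0 contribute zero.
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def bayes_error(p1, p2, prior1=0.5):
    # Bayes error: sum over x of min(prior1 * p1(x), prior2 * p2(x)).
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    return float(np.sum(np.minimum(prior1 * p1, (1 - prior1) * p2)))

def error_with_approximations(p1, p2, q1, q2, prior1=0.5):
    # Error incurred when the decision rule is built from the
    # approximations q1, q2 but the data actually follow p1, p2.
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    q1, q2 = np.asarray(q1, float), np.asarray(q2, float)
    decide_class1 = prior1 * q1 >= (1 - prior1) * q2
    contrib = np.where(decide_class1, (1 - prior1) * p2, prior1 * p1)
    return float(np.sum(contrib))

# True class-conditional distributions over a 4-point measurement space
# (hypothetical values for illustration).
p1 = [0.70, 0.20, 0.05, 0.05]
p2 = [0.10, 0.10, 0.40, 0.40]
# Approximating distributions (also hypothetical).
q1 = [0.60, 0.25, 0.10, 0.05]
q2 = [0.15, 0.10, 0.35, 0.40]

print("I(p1||q1) =", i_divergence(p1, q1))
print("I(p2||q2) =", i_divergence(p2, q2))
print("Bayes error      =", bayes_error(p1, p2))
print("Error with q1,q2 =", error_with_approximations(p1, p2, q1, q2))
```

The error obtained with the approximating distributions can never fall below the Bayes error, and it coincides with the Bayes error when the approximations equal the originals (in which case the I-divergences vanish), consistent with the bound's behavior stated in the abstract.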