Jing Peng, Douglas R. Heisterkamp, H. K. Dai
Electrical Engineering and Computer Science Department, Tulane University, New Orleans, LA 70118, USA.
IEEE Trans Pattern Anal Mach Intell. 2004 May;26(5):656-61. doi: 10.1109/TPAMI.2004.1273978.
Nearest neighbor classification assumes locally constant class conditional probabilities. This assumption breaks down in high dimensions due to the curse of dimensionality, and the nearest neighbor rule can then incur severe bias. We propose an adaptive nearest neighbor classification method to minimize this bias. We use quasiconformally transformed kernels to compute neighborhoods over which the class conditional probabilities tend to be more homogeneous; as a result, better classification performance can be expected. The efficacy of our method is validated and compared against competing techniques on a variety of data sets.
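The core idea can be sketched in code. A quasiconformal transformation scales a base kernel K(x, y) by a positive weighting function c(·), giving K̃(x, y) = c(x)c(y)K(x, y); the distance induced by K̃ then defines the neighborhoods used for nearest neighbor voting. The sketch below is a minimal illustration, not the paper's implementation: the weighting function `c` is left as a user-supplied hypothetical (the paper derives a specific choice from class probability estimates, which is not reproduced here), and a Gaussian RBF is assumed as the base kernel.

```python
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    # Base kernel: Gaussian RBF, K(x, y) = exp(-gamma * ||x - y||^2).
    return np.exp(-gamma * np.sum((x - y) ** 2))

def quasiconformal_kernel(x, y, c, gamma=1.0):
    # Quasiconformal transformation of the base kernel:
    # K~(x, y) = c(x) * c(y) * K(x, y), with c(.) > 0.
    # (The choice of c is what adapts the metric; it is a
    # hypothetical placeholder here.)
    return c(x) * c(y) * rbf_kernel(x, y, gamma)

def kernel_distance_sq(x, y, c, gamma=1.0):
    # Squared distance induced by K~ in its feature space:
    # d^2(x, y) = K~(x, x) - 2 K~(x, y) + K~(y, y).
    return (quasiconformal_kernel(x, x, c, gamma)
            - 2.0 * quasiconformal_kernel(x, y, c, gamma)
            + quasiconformal_kernel(y, y, c, gamma))

def knn_predict(query, X, labels, c, k=3, gamma=1.0):
    # Majority vote among the k training points nearest to `query`
    # under the quasiconformal kernel distance.
    d = np.array([kernel_distance_sq(query, xi, c, gamma) for xi in X])
    nearest = np.argsort(d)[:k]
    vals, counts = np.unique(labels[nearest], return_counts=True)
    return vals[np.argmax(counts)]
```

With the trivial weighting c(x) = 1, the induced distance reduces to the ordinary RBF kernel distance; a nonconstant c stretches or shrinks neighborhoods locally, which is how the adaptive metric is obtained.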