Yang Le, Song Shiji, Gong Yanshang, Gao Huang, Wu Cheng
IEEE Trans Neural Netw Learn Syst. 2019 Oct;30(10):3205-3210. doi: 10.1109/TNNLS.2018.2890103. Epub 2019 Jan 23.
In this brief, we propose a novel nonparametric supervised linear dimension reduction (SLDR) algorithm that extracts features by maximizing the pairwise separation probability. The separation probability, a new class separability measure, describes the generalization accuracy obtained when a linear classifier is trained on the extracted features. By extracting high-quality features, the proposed method avoids overlap between classes that lie close to each other in the input space and improves subsequent classification performance. Experiments on benchmark data sets show the superiority of the proposed algorithm over several state-of-the-art SLDR methods.
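The abstract does not give the algorithm's details, so the separation-probability criterion itself cannot be reproduced here. As a minimal illustration of the general SLDR setting the abstract describes (project data linearly, then train a linear classifier on the projected features), the sketch below uses the classical Fisher discriminant direction on synthetic two-class Gaussian data; the data, dimensions, and Fisher criterion are all assumptions for illustration, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two Gaussian classes in 5-D (the paper uses benchmark data sets).
n, d = 200, 5
X0 = rng.normal(loc=0.0, scale=1.0, size=(n, d))
X1 = rng.normal(loc=2.0, scale=1.0, size=(n, d))

# Within-class scatter (sum of the per-class covariances).
mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)

# Fisher's discriminant direction -- a classical SLDR baseline,
# NOT the separation-probability criterion proposed in the brief.
w = np.linalg.solve(Sw, mu1 - mu0)

# Project to 1-D and classify with the midpoint threshold,
# i.e., train the simplest possible linear classifier on the features.
z0, z1 = X0 @ w, X1 @ w
thresh = 0.5 * (z0.mean() + z1.mean())
acc = 0.5 * (np.mean(z0 < thresh) + np.mean(z1 > thresh))
print(f"balanced accuracy after projection: {acc:.3f}")
```

The classification accuracy after projection plays the role that the generalization accuracy (the separation probability) plays in the brief's class-separability measure.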