IEEE Trans Image Process. 2017 Feb;26(2):660-670. doi: 10.1109/TIP.2016.2631887. Epub 2016 Nov 22.
Metric learning has attracted increasing attention due to its critical role in image analysis and classification. Conventional metric learning assumes that the training and test data are sampled from the same or similar distributions. Moreover, building an effective distance metric requires abundant supervised knowledge (i.e., side/label information), which is generally inaccessible in practice because of the high cost of labeling. In this paper, we develop a robust transfer metric learning (RTML) framework that assists learning in the unlabeled target domain by transferring knowledge from a well-labeled source domain. Specifically, RTML exploits knowledge transfer to mitigate the domain shift in two spaces, i.e., the sample space and the feature space. In the sample space, domain-wise and class-wise adaptation schemes are adopted to bridge the marginal and conditional distribution gaps between the two domains. In the feature space, the metric is built in a marginalized denoising fashion under a low-rank constraint, which makes it more robust to the noisy data encountered in practice. Furthermore, we design an explicit rank-constraint regularizer as a tractable surrogate for the NP-hard rank minimization problem, guiding the low-rank metric learning. Experimental results on several standard benchmarks demonstrate the effectiveness of the proposed RTML in comparison with state-of-the-art transfer learning and metric learning algorithms.
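To make the ingredients above concrete, the following display is a minimal sketch of what such an objective can look like, assuming a Mahalanobis metric M, a labeled source domain D_s, an unlabeled target domain D_t, per-class subsets D_s^c and D_t^c, and trade-off weights alpha, beta, gamma; these symbols and the particular surrogate are illustrative assumptions, not the paper's exact formulation.

\[
d_M(x_i, x_j) = (x_i - x_j)^\top M \, (x_i - x_j), \qquad M \succeq 0,
\]
\[
\min_{M \succeq 0}\; \mathcal{L}(M; \mathcal{D}_s)
 + \alpha\, \mathrm{MMD}^2(\mathcal{D}_s, \mathcal{D}_t; M)
 + \beta \sum_{c} \mathrm{MMD}^2(\mathcal{D}_s^{c}, \mathcal{D}_t^{c}; M)
 + \gamma\, \mathcal{R}_{\mathrm{rank}}(M).
\]

Here \(\mathcal{L}\) is a supervised metric-learning loss on the source domain, the first MMD term targets the marginal (domain-wise) distribution gap, the per-class sum targets the conditional (class-wise) gap, and \(\mathcal{R}_{\mathrm{rank}}\) stands for a tractable rank surrogate (e.g., the nuclear norm \(\|M\|_*\)) in place of the NP-hard \(\mathrm{rank}(M)\) objective.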