Large Margin Weighted k-Nearest Neighbors Label Distribution Learning for Classification.

Authors

Wang Jing, Geng Xin

Publication Information

IEEE Trans Neural Netw Learn Syst. 2024 Nov;35(11):16720-16732. doi: 10.1109/TNNLS.2023.3297261. Epub 2024 Oct 29.

Abstract

Label distribution learning (LDL) helps solve label ambiguity and has found wide applications. However, it may suffer from the challenge of objective inconsistency when applied to classification problems, because the learning objective of LDL is inconsistent with that of classification. Some LDL algorithms have been proposed to solve this issue, but they presume that the label distribution can be represented by the maximum entropy model, which may not hold in many real-world problems. In this article, we design two novel LDL methods based on the k-nearest neighbors (kNN) approach without assuming any form of label distribution. First, we propose the large margin weighted kNN LDL (LW-kNNLDL). It learns a weight vector for the kNN algorithm to learn label distributions, and imposes a large margin to address the objective inconsistency. Second, we put forward the large margin distance-weighted kNN LDL (LDkNN-LDL), which learns distance-dependent weight vectors to account for differences in the neighborhoods of different instances. Theoretical results show that our methods can learn any general-form label distribution. Moreover, extensive experimental studies validate that our methods significantly outperform state-of-the-art LDL approaches.
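The core idea of a weighted kNN label distribution learner can be sketched as follows. This is a minimal illustrative implementation, not the authors' algorithm: the rank-based weight vector `w` is what LW-kNNLDL would learn (here it defaults to uniform weights), and the function name and interface are hypothetical.

```python
import numpy as np

def knn_label_distribution(X_train, D_train, x, k=3, w=None):
    """Predict a label distribution for instance x as a weighted average
    of the label distributions of its k nearest training neighbors.

    w is a rank-based weight vector over the k neighbors (nearest first).
    LW-kNNLDL learns such a vector; here it defaults to uniform weights.
    """
    dists = np.linalg.norm(X_train - x, axis=1)  # Euclidean distances to all training points
    idx = np.argsort(dists)[:k]                  # indices of the k nearest neighbors
    if w is None:
        w = np.ones(k)                           # uniform weights as an illustrative fallback
    d = w @ D_train[idx]                         # weighted sum of the neighbors' distributions
    return d / d.sum()                           # renormalize to a valid label distribution
```

For classification, the predicted label would simply be the argmax of the returned distribution; the large-margin component of the papers' methods constrains the learned weights so that this argmax agrees with the ground-truth label by a margin.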

