Suppr 超能文献



Knowledge-enhanced prototypical network with class cluster loss for few-shot relation classification.

Affiliations

College of Software, Xinjiang University, Urumqi, China.

Xinjiang Multilingual Information Technology Laboratory, Xinjiang University, Urumqi, China.

Publication

PLoS One. 2023 Jun 8;18(6):e0286915. doi: 10.1371/journal.pone.0286915. eCollection 2023.

DOI: 10.1371/journal.pone.0286915
PMID: 37289767
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10249838/
Abstract

Few-shot relation classification identifies the relation between target entity pairs in unstructured natural-language text by training on a small number of labeled samples. Recent prototype-network-based studies have focused on enhancing the prototype representation capability of models by incorporating external knowledge. However, the majority of these works constrain the representation of class prototypes implicitly through complex network structures, such as multi-attention mechanisms, graph neural networks, and contrastive learning, which restricts the model's ability to generalize. In addition, most models trained with a triplet loss disregard intra-class compactness during training, limiting their ability to handle outlier samples with low semantic similarity. Therefore, this paper proposes a non-weighted prototype enhancement module that uses the feature-level similarity between prototypes and relation information as a gate to filter and complete features. Meanwhile, we design a class cluster loss that samples difficult positive and negative samples and explicitly constrains both intra-class compactness and inter-class separability, learning a metric space with high discriminability. Extensive experiments were conducted on the publicly available FewRel 1.0 and 2.0 datasets, and the results demonstrate the effectiveness of the proposed model.
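The gated enhancement described in the abstract can be sketched roughly as follows. This is an illustrative guess at the mechanism, not the paper's formulation: the element-wise product standing in for "feature-level similarity", the sigmoid gate, and the names `enhance_prototype` and `relation_emb` are all assumptions.

```python
import numpy as np

def sigmoid(x):
    """Standard logistic function, mapping similarities to (0, 1) gate values."""
    return 1.0 / (1.0 + np.exp(-x))

def enhance_prototype(proto, relation_emb):
    """Hypothetical gated fusion of a class prototype with relation information.

    Per feature dimension: where prototype and relation embedding agree
    (large positive product), the gate opens toward the prototype; where
    they disagree, the relation embedding "completes" that feature instead.
    """
    gate = sigmoid(proto * relation_emb)       # feature-level similarity as a gate
    return gate * proto + (1.0 - gate) * relation_emb
```

For instance, with `proto = [10, -10]` and `relation_emb = [10, 10]`, the first dimension agrees (gate near 1, prototype kept) while the second disagrees (gate near 0, relation feature substituted), yielding approximately `[10, 10]`.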
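The class cluster loss can likewise be sketched as a margin loss over hard positives and hard negatives. This is a minimal stand-in consistent with the abstract's description (hardest same-class sample pulled toward its prototype; nearest foreign prototype pushed away), assuming Euclidean distances and two margin hyperparameters; the paper's exact terms and sampling strategy may differ.

```python
import numpy as np

def class_cluster_loss(embeddings, labels, margin_intra=0.2, margin_inter=1.0):
    """Hypothetical class-cluster-style loss over a labeled batch.

    Intra-class term: for each class, penalise the farthest (hardest positive)
    sample's distance to its prototype beyond margin_intra -> compactness.
    Inter-class term: for each prototype, penalise the nearest (hardest
    negative) foreign prototype within margin_inter -> separability.
    """
    classes = np.unique(labels)
    protos = np.stack([embeddings[labels == c].mean(axis=0) for c in classes])

    # Intra-class compactness over hardest positives.
    intra = 0.0
    for i, c in enumerate(classes):
        dists = np.linalg.norm(embeddings[labels == c] - protos[i], axis=1)
        intra += max(dists.max() - margin_intra, 0.0)
    intra /= len(classes)

    # Inter-class separability over hardest negative prototypes.
    inter = 0.0
    for i in range(len(classes)):
        dists = np.linalg.norm(protos - protos[i], axis=1)
        dists[i] = np.inf  # exclude self-distance
        inter += max(margin_inter - dists.min(), 0.0)
    inter /= len(classes)

    return intra + inter
```

On tight, well-separated clusters both hinge terms vanish and the loss is zero; overlapping clusters incur a positive penalty from both terms.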


Figures:
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2da4/10249838/b5f6695a8a24/pone.0286915.g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2da4/10249838/fa6e9a66921c/pone.0286915.g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2da4/10249838/7e889f540273/pone.0286915.g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2da4/10249838/0c75e18d295c/pone.0286915.g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2da4/10249838/59805bdc18d1/pone.0286915.g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2da4/10249838/90384db20a47/pone.0286915.g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2da4/10249838/5eb33de6466e/pone.0286915.g007.jpg

Similar articles

1. Knowledge-enhanced prototypical network with class cluster loss for few-shot relation classification.
PLoS One. 2023 Jun 8;18(6):e0286915. doi: 10.1371/journal.pone.0286915. eCollection 2023.
2. Improving few-shot relation extraction through semantics-guided learning.
Neural Netw. 2024 Jan;169:453-461. doi: 10.1016/j.neunet.2023.10.053. Epub 2023 Nov 3.
3. Adaptive class augmented prototype network for few-shot relation extraction.
Neural Netw. 2024 Jan;169:134-142. doi: 10.1016/j.neunet.2023.10.025. Epub 2023 Oct 19.
4. Adaptive Prototypical Networks With Label Words and Joint Representation Learning for Few-Shot Relation Classification.
IEEE Trans Neural Netw Learn Syst. 2023 Mar;34(3):1406-1417. doi: 10.1109/TNNLS.2021.3105377. Epub 2023 Feb 28.
5. DiffFSRE: Diffusion-Enhanced Prototypical Network for Few-Shot Relation Extraction.
Entropy (Basel). 2024 Apr 23;26(5):352. doi: 10.3390/e26050352.
6. Construction and Research on Chinese Semantic Mapping Based on Linguistic Features and Sparse Self-Learning Neural Networks.
Comput Intell Neurosci. 2022 Jun 20;2022:2315802. doi: 10.1155/2022/2315802. eCollection 2022.
7. Feature fusion network based on few-shot fine-grained classification.
Front Neurorobot. 2023 Nov 9;17:1301192. doi: 10.3389/fnbot.2023.1301192. eCollection 2023.
8. Word Embedding Distribution Propagation Graph Network for Few-Shot Learning.
Sensors (Basel). 2022 Mar 30;22(7):2648. doi: 10.3390/s22072648.
9. Contrastive Prototype-Guided Generation for Generalized Zero-Shot Learning.
Neural Netw. 2024 Aug;176:106324. doi: 10.1016/j.neunet.2024.106324. Epub 2024 Apr 15.
10. Semantic-enhanced graph neural network for named entity recognition in ancient Chinese books.
Sci Rep. 2024 Jul 30;14(1):17488. doi: 10.1038/s41598-024-68561-x.

Cited by

1. Plant and Disease Recognition Based on PMF Pipeline Domain Adaptation Method: Using Bark Images as Meta-Dataset.
Plants (Basel). 2023 Sep 15;12(18):3280. doi: 10.3390/plants12183280.

References

1. Study on the evolution of Chinese characters based on few-shot learning: From oracle bone inscriptions to regular script.
PLoS One. 2022 Aug 19;17(8):e0272974. doi: 10.1371/journal.pone.0272974. eCollection 2022.
2. Relation classification via BERT with piecewise convolution and focal loss.
PLoS One. 2021 Sep 10;16(9):e0257092. doi: 10.1371/journal.pone.0257092. eCollection 2021.
3. Adaptive Prototypical Networks With Label Words and Joint Representation Learning for Few-Shot Relation Classification.
IEEE Trans Neural Netw Learn Syst. 2023 Mar;34(3):1406-1417. doi: 10.1109/TNNLS.2021.3105377. Epub 2023 Feb 28.
4. Meta-Learning in Neural Networks: A Survey.
IEEE Trans Pattern Anal Mach Intell. 2022 Sep;44(9):5149-5169. doi: 10.1109/TPAMI.2021.3079209. Epub 2022 Aug 4.
5. Channel-spatial attention network for fewshot classification.
PLoS One. 2019 Dec 12;14(12):e0225426. doi: 10.1371/journal.pone.0225426. eCollection 2019.