

Word Embedding Distribution Propagation Graph Network for Few-Shot Learning.

Affiliations

College of Computer Science and Technology, Changchun University of Science and Technology, Changchun 130022, China.

Publication

Sensors (Basel). 2022 Mar 30;22(7):2648. doi: 10.3390/s22072648.

DOI: 10.3390/s22072648
PMID: 35408261
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC9002792/
Abstract

Few-shot learning (FSL) is of great significance to the field of machine learning. The ability to learn and generalize using a small number of samples is an obvious distinction between artificial intelligence and humans. In the FSL domain, most graph neural networks (GNNs) focus on transferring labeled sample information to an unlabeled query sample, ignoring the important role of semantic information during the classification process. Our proposed method embeds semantic information of classes into a GNN, creating a word embedding distribution propagation graph network (WPGN) for FSL. We merge the attention mechanism with our backbone network, use the Mahalanobis distance to calculate the similarity of classes, select the Funnel ReLU (FReLU) function as the activation function of the Transform layer, and update the point graph and word embedding distribution graph. In extensive experiments on FSL benchmarks, compared with the baseline model, the accuracy of the WPGN on the 5-way-1/2/5 shot tasks increased by 9.03, 4.56, and 4.15%, respectively.

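The abstract names the Mahalanobis distance as the measure of class similarity. As a rough illustration of that step only (not the authors' implementation; all function and variable names here are made up), a shared-covariance Mahalanobis distance from a query embedding to a set of class prototypes can be sketched as:

```python
import numpy as np

def mahalanobis_distances(query, class_means, cov):
    """Mahalanobis distance from one query embedding to each class mean.

    query: (d,) embedding vector
    class_means: (k, d) per-class prototype embeddings
    cov: (d, d) shared covariance matrix

    Smaller distance = higher class similarity. Illustrative sketch of the
    distance measure named in the abstract, not the paper's exact code.
    """
    cov_inv = np.linalg.inv(cov)
    diffs = class_means - query  # (k, d)
    # d_M(x, mu_k) = sqrt((x - mu_k)^T S^{-1} (x - mu_k)) for each class k
    return np.sqrt(np.einsum("kd,de,ke->k", diffs, cov_inv, diffs))

# Toy usage: 3 classes in a 2-D embedding space, identity covariance
means = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 3.0]])
dists = mahalanobis_distances(np.array([0.1, 0.1]), means, np.eye(2))
predicted_class = int(dists.argmin())  # nearest prototype wins
```

With an identity covariance this reduces to Euclidean distance; a learned or estimated covariance lets the metric account for correlated embedding dimensions.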

Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/cad6/9002792/9d1264bff66f/sensors-22-02648-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/cad6/9002792/5c668d52fb38/sensors-22-02648-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/cad6/9002792/17052060cd45/sensors-22-02648-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/cad6/9002792/68c350ce8ae8/sensors-22-02648-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/cad6/9002792/535e22b8721a/sensors-22-02648-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/cad6/9002792/28a1075e14f7/sensors-22-02648-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/cad6/9002792/de5009b85816/sensors-22-02648-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/cad6/9002792/900b612d5dbe/sensors-22-02648-g008a.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/cad6/9002792/dfb3bacb2dbc/sensors-22-02648-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/cad6/9002792/58697027fb00/sensors-22-02648-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/cad6/9002792/8c6a77e1226f/sensors-22-02648-g011.jpg

Similar Articles

1
Word Embedding Distribution Propagation Graph Network for Few-Shot Learning.
Sensors (Basel). 2022 Mar 30;22(7):2648. doi: 10.3390/s22072648.
2
LGLNN: Label Guided Graph Learning-Neural Network for few-shot learning.
Neural Netw. 2022 Nov;155:50-57. doi: 10.1016/j.neunet.2022.08.003. Epub 2022 Aug 6.
3
Knowledge-Guided Multi-Label Few-Shot Learning for General Image Recognition.
IEEE Trans Pattern Anal Mach Intell. 2022 Mar;44(3):1371-1384. doi: 10.1109/TPAMI.2020.3025814. Epub 2022 Feb 3.
4
Multi-label zero-shot learning with graph convolutional networks.
Neural Netw. 2020 Dec;132:333-341. doi: 10.1016/j.neunet.2020.09.010. Epub 2020 Sep 21.
5
DropAGG: Robust Graph Neural Networks via Drop Aggregation.
Neural Netw. 2023 Jun;163:65-74. doi: 10.1016/j.neunet.2023.03.022. Epub 2023 Mar 29.
6
Few-Shot Fine-Grained Image Classification via GNN.
Sensors (Basel). 2022 Oct 9;22(19):7640. doi: 10.3390/s22197640.
7
Transductive Relation-Propagation With Decoupling Training for Few-Shot Learning.
IEEE Trans Neural Netw Learn Syst. 2022 Nov;33(11):6652-6664. doi: 10.1109/TNNLS.2021.3082928. Epub 2022 Oct 27.
8
Bias-Eliminated Semantic Refinement for Any-Shot Learning.
IEEE Trans Image Process. 2022;31:2229-2244. doi: 10.1109/TIP.2022.3152631. Epub 2022 Mar 8.
9
Mutual Correlation Network for few-shot learning.
Neural Netw. 2024 Jul;175:106289. doi: 10.1016/j.neunet.2024.106289. Epub 2024 Apr 3.
10
SP-GNN: Learning structure and position information from graphs.
Neural Netw. 2023 Apr;161:505-514. doi: 10.1016/j.neunet.2023.01.051. Epub 2023 Feb 4.

References Cited by This Article

1
Multi-level Semantic Feature Augmentation for One-shot Learning.
IEEE Trans Image Process. 2019 Apr 9. doi: 10.1109/TIP.2019.2910052.