

Attention-Aware Multi-Task Convolutional Neural Networks.

Author Information

Lyu Kejie, Li Yingming, Zhang Zhongfei

Publication Information

IEEE Trans Image Process. 2019 Oct 4. doi: 10.1109/TIP.2019.2944522.

DOI: 10.1109/TIP.2019.2944522
PMID: 31603785
Abstract

Multi-task deep learning methods learn multiple tasks simultaneously and share representations among them, so information from related tasks improves learning within each individual task and substantially enhances the generalization capabilities of the resulting models. Typical multi-task deep learning models share representations across tasks in the lower layers of the network and separate them in the higher layers. However, different groups of tasks have different requirements for sharing representations, so such a hand-crafted design criterion does not necessarily yield an optimal network architecture. In addition, most existing methods ignore the redundancy problem and lack a pre-screening step for representations before they are shared. Here, we propose a model called the Attention-aware Multi-task Convolutional Neural Network, which automatically learns appropriate sharing through end-to-end training. An attention mechanism is introduced into the architecture to suppress redundant content in the shared representations, and a shortcut connection is adopted to preserve useful information. We evaluate the model with experiments on different task groups and different datasets. It outperforms existing techniques in many of these experiments, indicating its effectiveness and robustness. We also demonstrate the importance of the attention mechanism and the shortcut connection in our model.

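To make the idea in the abstract concrete, below is a minimal, hypothetical PyTorch sketch, not the authors' released code: the two-task setup, module names, channel sizes, and the specific gating form are all illustrative assumptions. A shared convolutional backbone feeds two task-specific heads; each head pre-screens the shared representation with a channel-wise attention gate that suppresses redundant features, and a shortcut connection adds the ungated features back so useful information is preserved.

```python
# Hypothetical sketch of an attention-gated multi-task CNN (illustrative only).
import torch
import torch.nn as nn


class AttentionGate(nn.Module):
    """Channel-wise attention that down-weights redundant shared features."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Shortcut connection: add the original features back so useful
        # information survives even where the gate attenuates channels.
        return x * self.gate(x) + x


class MultiTaskCNN(nn.Module):
    """Shared backbone with per-task attention gates and prediction heads."""

    def __init__(self, num_classes_a: int, num_classes_b: int):
        super().__init__()
        # Shared lower layers: representations learned jointly by both tasks.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(4),
        )
        # Each task pre-screens the shared representation with its own gate.
        self.gate_a = AttentionGate(64)
        self.gate_b = AttentionGate(64)
        self.head_a = nn.Linear(64 * 4 * 4, num_classes_a)
        self.head_b = nn.Linear(64 * 4 * 4, num_classes_b)

    def forward(self, x: torch.Tensor):
        shared = self.backbone(x)
        out_a = self.head_a(self.gate_a(shared).flatten(1))
        out_b = self.head_b(self.gate_b(shared).flatten(1))
        return out_a, out_b


if __name__ == "__main__":
    model = MultiTaskCNN(num_classes_a=10, num_classes_b=5)
    logits_a, logits_b = model(torch.randn(2, 3, 64, 64))
    print(logits_a.shape, logits_b.shape)  # torch.Size([2, 10]) torch.Size([2, 5])
```

In the paper the sharing pattern itself is learned end-to-end; this sketch fixes one plausible pattern purely to show the roles of the attention gate and the shortcut connection.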

Similar Articles

1. Attention-Aware Multi-Task Convolutional Neural Networks. IEEE Trans Image Process. 2019 Oct 4. doi: 10.1109/TIP.2019.2944522.
2. A modality-collaborative convolution and transformer hybrid network for unpaired multi-modal medical image segmentation with limited annotations. Med Phys. 2023 Sep;50(9):5460-5478. doi: 10.1002/mp.16338. Epub 2023 Mar 15.
3. Learning Spatial-Spectral-Temporal EEG Representations with Deep Attentive-Recurrent-Convolutional Neural Networks for Pain Intensity Assessment. Neuroscience. 2022 Jan 15;481:144-155. doi: 10.1016/j.neuroscience.2021.11.034. Epub 2021 Nov 26.
4. DeepDistance: A multi-task deep regression model for cell detection in inverted microscopy images. Med Image Anal. 2020 Jul;63:101720. doi: 10.1016/j.media.2020.101720. Epub 2020 May 11.
5. Multi-Task Network Representation Learning. Front Neurosci. 2020 Jan 23;14:1. doi: 10.3389/fnins.2020.00001. eCollection 2020.
6. Multi-level and joint attention networks on brain functional connectivity for cross-cognitive prediction. Med Image Anal. 2023 Dec;90:102921. doi: 10.1016/j.media.2023.102921. Epub 2023 Aug 21.
7. Semi-Supervised Multi-View Deep Discriminant Representation Learning. IEEE Trans Pattern Anal Mach Intell. 2021 Jul;43(7):2496-2509. doi: 10.1109/TPAMI.2020.2973634. Epub 2021 Jun 8.
8. 3D Multi-Attention Guided Multi-Task Learning Network for Automatic Gastric Tumor Segmentation and Lymph Node Classification. IEEE Trans Med Imaging. 2021 Jun;40(6):1618-1631. doi: 10.1109/TMI.2021.3062902. Epub 2021 Jun 1.
9. Multi-task Learning for Neonatal Brain Segmentation Using 3D Dense-Unet with Dense Attention Guided by Geodesic Distance. Domain Adapt Represent Transf Med Image Learn Less Labels Imperfect Data (2019). 2019 Oct;11795:243-251. doi: 10.1007/978-3-030-33391-1_28. Epub 2019 Oct 13.
10. Locality preserving dense graph convolutional networks with graph context-aware node representations. Neural Netw. 2021 Nov;143:108-120. doi: 10.1016/j.neunet.2021.05.031. Epub 2021 Jun 2.