Suppr 超能文献




An Efficient Sampling-Based Attention Network for Semantic Segmentation.

Author Information

He Xingjian, Liu Jing, Wang Weining, Lu Hanqing

Publication Information

IEEE Trans Image Process. 2022;31:2850-2863. doi: 10.1109/TIP.2022.3162101. Epub 2022 Apr 5.

DOI: 10.1109/TIP.2022.3162101
PMID: 35353701
Abstract

Self-attention is widely explored to model long-range dependencies in semantic segmentation. However, this operation computes pair-wise relationships between the query point and all other points, leading to prohibitive complexity. In this paper, we propose an efficient Sampling-based Attention Network which combines a novel sample method with an attention mechanism for semantic segmentation. Specifically, we design a Stochastic Sampling-based Attention Module (SSAM) to capture the relationships between the query point and a stochastic sampled representative subset from a global perspective, where the sampled subset is selected by a Stochastic Sampling Module. Compared to self-attention, our SSAM achieves comparable segmentation performance while significantly reducing computational redundancy. In addition, with the observation that not all pixels are interested in the contextual information, we design a Deterministic Sampling-based Attention Module (DSAM) to sample features from a local region for obtaining the detailed information. Extensive experiments demonstrate that our proposed method can compete or perform favorably against the state-of-the-art methods on the Cityscapes, ADE20K, COCO Stuff, and PASCAL Context datasets.
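The complexity reduction the abstract describes can be sketched in a few lines: instead of each query attending to all N positions (O(N²) pairwise scores), each query attends only to a random subset of K positions, cutting the cost to O(N·K). The sketch below is an illustrative toy under that idea, not the paper's exact SSAM; the function name `sampled_attention` and the `num_samples` parameter are assumptions for this example.

```python
import numpy as np

def sampled_attention(x, num_samples=64, seed=0):
    """Toy stochastic sampling-based attention (illustrative, not the
    paper's SSAM): each query attends to a random subset of positions
    rather than all N, reducing cost from O(N^2) to O(N*K)."""
    n, c = x.shape
    rng = np.random.default_rng(seed)
    # Simplified stand-in for the paper's Stochastic Sampling Module:
    # draw K distinct positions uniformly at random.
    idx = rng.choice(n, size=min(num_samples, n), replace=False)
    keys = x[idx]                            # (K, C) sampled keys
    values = x[idx]                          # (K, C) sampled values
    scores = x @ keys.T / np.sqrt(c)         # (N, K) query-key similarities
    scores -= scores.max(axis=1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True) # softmax over the subset
    return weights @ values                  # (N, C) aggregated context

# Example: 1024 positions, 32-dim features; each query touches only 64 keys.
x = np.random.default_rng(1).normal(size=(1024, 32))
out = sampled_attention(x, num_samples=64)
print(out.shape)  # (1024, 32)
```

In full self-attention the score matrix would be (N, N); here it is (N, K), which is where the savings come from. The paper's DSAM would analogously restrict each query to a deterministic local neighborhood instead of a random global subset.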


Similar Articles

1. An Efficient Sampling-Based Attention Network for Semantic Segmentation.
   IEEE Trans Image Process. 2022;31:2850-2863. doi: 10.1109/TIP.2022.3162101. Epub 2022 Apr 5.
2. CCNet: Criss-Cross Attention for Semantic Segmentation.
   IEEE Trans Pattern Anal Mach Intell. 2023 Jun;45(6):6896-6908. doi: 10.1109/TPAMI.2020.3007032. Epub 2023 May 5.
3. Multiple-Attention Mechanism Network for Semantic Segmentation.
   Sensors (Basel). 2022 Jun 13;22(12):4477. doi: 10.3390/s22124477.
4. CTNet: Context-Based Tandem Network for Semantic Segmentation.
   IEEE Trans Pattern Anal Mach Intell. 2022 Dec;44(12):9904-9917. doi: 10.1109/TPAMI.2021.3132068. Epub 2022 Nov 7.
5. SAB Net: A Semantic Attention Boosting Framework for Semantic Segmentation.
   IEEE Trans Neural Netw Learn Syst. 2025 Mar;36(3):4029-4041. doi: 10.1109/TNNLS.2022.3144003. Epub 2025 Feb 28.
6. Scene Segmentation With Dual Relation-Aware Attention Network.
   IEEE Trans Neural Netw Learn Syst. 2021 Jun;32(6):2547-2560. doi: 10.1109/TNNLS.2020.3006524. Epub 2021 Jun 2.
7. Global Aggregation Then Local Distribution for Scene Parsing.
   IEEE Trans Image Process. 2021;30:6829-6842. doi: 10.1109/TIP.2021.3099366.
8. Global-Guided Selective Context Network for Scene Parsing.
   IEEE Trans Neural Netw Learn Syst. 2022 Apr;33(4):1752-1764. doi: 10.1109/TNNLS.2020.3043808. Epub 2022 Apr 4.
9. Double Similarity Distillation for Semantic Image Segmentation.
   IEEE Trans Image Process. 2021;30:5363-5376. doi: 10.1109/TIP.2021.3083113. Epub 2021 Jun 3.
10. Denoised Non-Local Neural Network for Semantic Segmentation.
    IEEE Trans Neural Netw Learn Syst. 2024 May;35(5):7162-7174. doi: 10.1109/TNNLS.2022.3214216. Epub 2024 May 2.

Cited By

1. Inverse Design of Nanophotonic Devices Using Generative Adversarial Networks with the Sim-NN Model and Self-Attention Mechanism.
   Micromachines (Basel). 2023 Mar 10;14(3):634. doi: 10.3390/mi14030634.