Suppr 超能文献




Deep parameter-free attention hashing for image retrieval.

Affiliations

College of Software, Xinjiang University, Urumqi, 830046, China.

College of Information Science and Engineering, Xinjiang University, Urumqi, 830046, China.

Publication

Sci Rep. 2022 Apr 30;12(1):7082. doi: 10.1038/s41598-022-11217-5.

DOI: 10.1038/s41598-022-11217-5
PMID: 35490175
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC9056524/
Abstract

Deep hashing is widely applied in image retrieval because of its low storage consumption and fast retrieval speed. However, existing deep hashing methods that use a convolutional neural network (CNN) to extract image semantic features suffer from insufficient feature extraction. Some studies propose adding channel-based or spatial-based attention modules, but embedding these modules into the network increases model complexity and can lead to overfitting during training. In this study, a novel deep parameter-free attention hashing (DPFAH) method is proposed to solve these problems by designing a parameter-free attention (PFA) module in a ResNet18 network. PFA is a lightweight module that defines an energy function to measure the importance of each neuron and infers 3-D attention weights for the feature map in a layer. A fast closed-form solution for this energy function shows that the PFA module adds no parameters to the network. In addition, this paper designs a novel hashing framework that includes a hash-code learning branch and a classification branch to exploit more label information. The like-binary codes are constrained by a regularization term to reduce the quantization error introduced by the continuous relaxation. Experiments on CIFAR-10, NUS-WIDE and ImageNet-100 show that the DPFAH method achieves better performance.
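The two technical ideas in the abstract — closed-form, parameter-free 3-D attention weights derived from a per-neuron energy function, and a regularization term that pushes relaxed codes toward binary values — can be sketched numerically. The sketch below is a hypothetical NumPy illustration (not the authors' code): `pfa_attention` follows the common SimAM-style closed-form energy weighting, `lam` is an assumed energy regularizer, and all names are illustrative.

```python
import numpy as np

def pfa_attention(x, lam=1e-4):
    """Parameter-free 3-D attention over a (C, H, W) feature map.

    Each neuron's weight is a sigmoid of the inverse of a closed-form
    energy: neurons deviating from their channel mean are emphasized.
    No learnable parameters are introduced.
    """
    c, h, w = x.shape
    n = h * w - 1
    mu = x.mean(axis=(1, 2), keepdims=True)       # per-channel mean
    d = (x - mu) ** 2                             # squared deviation
    var = d.sum(axis=(1, 2), keepdims=True) / n   # per-channel variance
    e_inv = d / (4 * (var + lam)) + 0.5           # inverse energy per neuron
    return x * (1.0 / (1.0 + np.exp(-e_inv)))     # sigmoid gating

def quantization_penalty(codes):
    """Regularizer pulling relaxed (like-binary) codes toward {-1, +1}."""
    return np.mean((np.abs(codes) - 1.0) ** 2)

# Example: a 2-channel 4x4 feature map keeps its shape after attention.
x = np.random.default_rng(0).normal(size=(2, 4, 4))
y = pfa_attention(x)
assert y.shape == x.shape

# Exactly binary codes incur zero penalty; relaxed codes do not.
print(quantization_penalty(np.sign(np.array([0.3, -1.7]))))  # 0.0
```

Because the attention weights come from a closed-form expression over the feature statistics, the module adds computation but no trainable parameters, which is the property the abstract emphasizes.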


Figures:
Fig. 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4c24/9056524/e03be6bddcf6/41598_2022_11217_Fig1_HTML.jpg
Fig. 2: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4c24/9056524/3c76259305da/41598_2022_11217_Fig2_HTML.jpg
Fig. 3: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4c24/9056524/ad8d07f66a17/41598_2022_11217_Fig3_HTML.jpg
Fig. a: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4c24/9056524/f55f4e0cb407/41598_2022_11217_Figa_HTML.jpg
Fig. 4: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4c24/9056524/40fc2910e58c/41598_2022_11217_Fig4_HTML.jpg
Fig. 5: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4c24/9056524/cccf409e1bb7/41598_2022_11217_Fig5_HTML.jpg
Fig. 6: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4c24/9056524/b9150685d9d9/41598_2022_11217_Fig6_HTML.jpg
Fig. 7: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4c24/9056524/a27234576080/41598_2022_11217_Fig7_HTML.jpg
Fig. 8: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4c24/9056524/9f60373c3036/41598_2022_11217_Fig8_HTML.jpg
Fig. 9: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4c24/9056524/24d7ac42d05b/41598_2022_11217_Fig9_HTML.jpg
Fig. 10: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4c24/9056524/caff863172a0/41598_2022_11217_Fig10_HTML.jpg
Fig. 11: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4c24/9056524/3e800ddfa022/41598_2022_11217_Fig11_HTML.jpg
Fig. 12: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4c24/9056524/aa996ec91189/41598_2022_11217_Fig12_HTML.jpg
Fig. 13: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4c24/9056524/5f82ac1d7dc8/41598_2022_11217_Fig13_HTML.jpg
Fig. 14: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4c24/9056524/86990dce1c29/41598_2022_11217_Fig14_HTML.jpg
Fig. 15: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4c24/9056524/082c37352880/41598_2022_11217_Fig15_HTML.jpg

Similar Articles

1. Deep parameter-free attention hashing for image retrieval. Sci Rep. 2022 Apr 30;12(1):7082. doi: 10.1038/s41598-022-11217-5.
2. Hierarchical Recurrent Neural Hashing for Image Retrieval With Hierarchical Convolutional Features. IEEE Trans Image Process. 2018;27(1):106-120. doi: 10.1109/TIP.2017.2755766.
3. Triplet Deep Hashing with Joint Supervised Loss Based on Deep Neural Networks. Comput Intell Neurosci. 2019 Oct 9;2019:8490364. doi: 10.1155/2019/8490364. eCollection 2019.
4. Local Semantic-aware Deep Hashing with Hamming-isometric Quantization. IEEE Trans Image Process. 2018 Dec 21. doi: 10.1109/TIP.2018.2889269.
5. Scalable Deep Hashing for Large-scale Social Image Retrieval. IEEE Trans Image Process. 2019 Sep 16. doi: 10.1109/TIP.2019.2940693.
6. Deep Category-Level and Regularized Hashing With Global Semantic Similarity Learning. IEEE Trans Cybern. 2021 Dec;51(12):6240-6252. doi: 10.1109/TCYB.2020.2964993. Epub 2021 Dec 22.
7. Deep Semantic Multimodal Hashing Network for Scalable Image-Text and Video-Text Retrievals. IEEE Trans Neural Netw Learn Syst. 2023 Apr;34(4):1838-1851. doi: 10.1109/TNNLS.2020.2997020. Epub 2023 Apr 4.
8. Quadruplet-Based Deep Cross-Modal Hashing. Comput Intell Neurosci. 2021 Jul 2;2021:9968716. doi: 10.1155/2021/9968716. eCollection 2021.
9. Deep supervised hashing for gait retrieval. F1000Res. 2021 Oct 12;10:1038. doi: 10.12688/f1000research.51368.2. eCollection 2021.
10. Multi-scale Triplet Hashing for Medical Image Retrieval. Comput Biol Med. 2023 Mar;155:106633. doi: 10.1016/j.compbiomed.2023.106633. Epub 2023 Feb 8.

Cited By

1. CLIP-Based Adaptive Graph Attention Network for Large-Scale Unsupervised Multi-Modal Hashing Retrieval. Sensors (Basel). 2023 Mar 24;23(7):3439. doi: 10.3390/s23073439.
