

Deep Self-Taught Hashing for Image Retrieval.

Publication Information

IEEE Trans Cybern. 2019 Jun;49(6):2229-2241. doi: 10.1109/TCYB.2018.2822781. Epub 2018 May 4.

DOI: 10.1109/TCYB.2018.2822781
PMID: 29994014
Abstract

Hashing algorithms have been widely used to speed up image retrieval due to their compact binary codes and fast distance calculation. The combination with deep learning boosts the performance of hashing by learning accurate representations and complicated hashing functions. So far, the most striking successes in deep hashing have mostly involved discriminative models, which require labels. To apply deep hashing to datasets without labels, we propose a deep self-taught hashing algorithm (DSTH), which generates a set of pseudo labels by analyzing the data itself, and then learns the hash functions for novel data using discriminative deep models. Furthermore, we generalize DSTH to support both supervised and unsupervised cases by adaptively incorporating label information. We use two different deep learning frameworks to train the hash functions to deal with the out-of-sample problem and reduce the time complexity without loss of accuracy. We have conducted extensive experiments to investigate different settings of DSTH, and compared it with state-of-the-art counterparts on six publicly available datasets. The experimental results show that DSTH outperforms the others on all datasets.

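The two-stage recipe the abstract describes — derive pseudo labels from the data itself, then train discriminative hash functions on them for out-of-sample data — can be sketched in plain NumPy. This is a minimal illustration under stated assumptions, not the paper's actual architecture: stage 1 here uses a spectral embedding of a Gaussian affinity graph thresholded at the median to produce pseudo binary labels, and stage 2 fits linear hash functions by least squares (the paper uses deep models for both representation and hashing); all function names are illustrative.

```python
import numpy as np

def pseudo_binary_labels(X, n_bits=4, sigma=1.0):
    """Stage 1: unsupervised pseudo labels via a Laplacian eigenmap, binarized per bit."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    W = np.exp(-d2 / (2 * sigma ** 2))                   # Gaussian affinity graph
    L = np.diag(W.sum(1)) - W                            # graph Laplacian
    _, vecs = np.linalg.eigh(L)
    Y = vecs[:, 1:n_bits + 1]                            # skip the trivial constant eigenvector
    return (Y > np.median(Y, axis=0)).astype(int)        # median threshold -> balanced bits

def fit_hash_functions(X, B):
    """Stage 2: discriminative (here: linear) hash functions trained on the pseudo labels."""
    Xb = np.hstack([X, np.ones((len(X), 1))])            # append a bias column
    Wh, *_ = np.linalg.lstsq(Xb, 2 * B - 1, rcond=None)  # regress to {-1, +1} targets
    return Wh

def hash_codes(X, Wh):
    """Hash novel data with the learned functions (out-of-sample extension)."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return (Xb @ Wh > 0).astype(int)

# Toy data: two well-separated Gaussian clusters in 5 dimensions.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (20, 5)), rng.normal(3, 0.3, (20, 5))])
B = pseudo_binary_labels(X, n_bits=4)
Wh = fit_hash_functions(X, B)
codes = hash_codes(X, Wh)  # compact binary codes for fast Hamming-distance retrieval
```

Because stage 2 learns an explicit function rather than memorizing codes, new images can be hashed with a single matrix product — this is what lets the approach handle the out-of-sample problem the abstract mentions.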

Similar Articles

1. Deep Self-Taught Hashing for Image Retrieval.
   IEEE Trans Cybern. 2019 Jun;49(6):2229-2241. doi: 10.1109/TCYB.2018.2822781. Epub 2018 May 4.
2. Simultaneous Feature Aggregating and Hashing for Compact Binary Code Learning.
   IEEE Trans Image Process. 2019 Oct;28(10):4954-4969. doi: 10.1109/TIP.2019.2913509. Epub 2019 May 8.
3. Scalable Deep Hashing for Large-Scale Social Image Retrieval.
   IEEE Trans Image Process. 2019 Sep 16. doi: 10.1109/TIP.2019.2940693.
4. Deep Discrete Supervised Hashing.
   IEEE Trans Image Process. 2018 Dec;27(12):5996-6009. doi: 10.1109/TIP.2018.2864894. Epub 2018 Aug 10.
5. Hierarchical Recurrent Neural Hashing for Image Retrieval With Hierarchical Convolutional Features.
   IEEE Trans Image Process. 2018;27(1):106-120. doi: 10.1109/TIP.2017.2755766.
6. Instance-Aware Hashing for Multi-Label Image Retrieval.
   IEEE Trans Image Process. 2016 Jun;25(6):2469-79. doi: 10.1109/TIP.2016.2545300. Epub 2016 Mar 22.
7. Triplet Deep Hashing with Joint Supervised Loss Based on Deep Neural Networks.
   Comput Intell Neurosci. 2019 Oct 9;2019:8490364. doi: 10.1155/2019/8490364. eCollection 2019.
8. Unsupervised Semantic-Preserving Adversarial Hashing for Image Search.
   IEEE Trans Image Process. 2019 Aug;28(8):4032-4044. doi: 10.1109/TIP.2019.2903661. Epub 2019 Mar 13.
9. Unsupervised Deep Hashing with Similarity-Adaptive and Discrete Optimization.
   IEEE Trans Pattern Anal Mach Intell. 2018 Dec;40(12):3034-3044. doi: 10.1109/TPAMI.2018.2789887. Epub 2018 Jan 5.
10. Exploring Auxiliary Context: Discrete Semantic Transfer Hashing for Scalable Image Retrieval.
    IEEE Trans Neural Netw Learn Syst. 2018 Nov;29(11):5264-5276. doi: 10.1109/TNNLS.2018.2797248. Epub 2018 Feb 14.