
Fast Supervised Discrete Hashing.

Publication Information

IEEE Trans Pattern Anal Mach Intell. 2018 Feb;40(2):490-496. doi: 10.1109/TPAMI.2017.2678475. Epub 2017 Mar 7.

DOI: 10.1109/TPAMI.2017.2678475
PMID: 28287956
Abstract

Learning-based hashing algorithms are "hot topics" because they can greatly increase the scale at which existing methods operate. In this paper, we propose a new learning-based hashing method called "fast supervised discrete hashing" (FSDH) based on "supervised discrete hashing" (SDH). Regressing the training examples (or hash code) to the corresponding class labels is widely used in ordinary least squares regression. Rather than adopting this method, FSDH uses a very simple yet effective regression of the class labels of training examples to the corresponding hash code to accelerate the algorithm. To the best of our knowledge, this strategy has not previously been used for hashing. Traditional SDH decomposes the optimization into three sub-problems, with the most critical sub-problem - discrete optimization for binary hash codes - solved using iterative discrete cyclic coordinate descent (DCC), which is time-consuming. However, FSDH has a closed-form solution and only requires a single rather than iterative hash code-solving step, which is highly efficient. Furthermore, FSDH is usually faster than SDH for solving the projection matrix for least squares regression, making FSDH generally faster than SDH. For example, our results show that FSDH is about 12 times faster than SDH when the number of hashing bits is 128 on the CIFAR-10 database, and FSDH is about 151 times faster than FastHash when the number of hashing bits is 64 on the MNIST database. Our experimental results show that FSDH is not only fast, but also outperforms other comparative methods.
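The alternating scheme the abstract describes - ridge regression from class labels to hash codes, a least-squares feature projection, and a single closed-form sign update for the binary codes instead of iterative DCC - can be sketched in NumPy. This is a minimal illustration only: the parameter names (`nu`, `lam`), the number of alternations, and the plain linear feature map (standing in for SDH's nonlinear embedding of X) are assumptions, not the authors' exact formulation.

```python
import numpy as np

def fsdh_sketch(X, Y, n_bits, nu=1e-5, lam=1.0, n_iter=5, seed=0):
    """Illustrative FSDH-style alternating updates.

    X : (n, d) feature matrix (stand-in for the embedded features).
    Y : (n, c) one-hot class-label matrix.
    Returns binary codes B in {-1, +1} and projection P.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    c = Y.shape[1]
    B = np.sign(rng.standard_normal((n, n_bits)))  # random initial codes

    for _ in range(n_iter):
        # G-step: regress class labels onto codes (closed-form ridge solution).
        G = np.linalg.solve(Y.T @ Y + lam * np.eye(c), Y.T @ B)
        # P-step: least-squares projection from features to codes.
        P = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ B)
        # B-step: single closed-form sign update (no iterative DCC loop).
        B = np.sign(Y @ G + nu * (X @ P))
        B[B == 0] = 1  # break ties so codes stay strictly in {-1, +1}
    return B, P
```

The key point the abstract makes is visible in the B-step: because the label term regresses Y toward B rather than B toward Y, the code update reduces to one sign operation per alternation, which is where the reported speedup over SDH's iterative DCC comes from.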


Similar Articles

1
Fast Supervised Discrete Hashing.
IEEE Trans Pattern Anal Mach Intell. 2018 Feb;40(2):490-496. doi: 10.1109/TPAMI.2017.2678475. Epub 2017 Mar 7.
2
Hadamard Coding for Supervised Discrete Hashing.
IEEE Trans Image Process. 2018 Jul 12. doi: 10.1109/TIP.2018.2855427.
3
Supervised Discrete Hashing With Relaxation.
IEEE Trans Neural Netw Learn Syst. 2018 Mar;29(3):608-617. doi: 10.1109/TNNLS.2016.2636870. Epub 2016 Dec 29.
4
Supervised hashing using graph cuts and boosted decision trees.
IEEE Trans Pattern Anal Mach Intell. 2015 Nov;37(11):2317-31. doi: 10.1109/TPAMI.2015.2404776.
5
Deep Discrete Supervised Hashing.
IEEE Trans Image Process. 2018 Dec;27(12):5996-6009. doi: 10.1109/TIP.2018.2864894. Epub 2018 Aug 10.
6
Efficient Semi-Supervised Multimodal Hashing With Importance Differentiation Regression.
IEEE Trans Image Process. 2022;31:5881-5892. doi: 10.1109/TIP.2022.3203216. Epub 2022 Sep 13.
7
Hierarchical Recurrent Neural Hashing for Image Retrieval With Hierarchical Convolutional Features.
IEEE Trans Image Process. 2018;27(1):106-120. doi: 10.1109/TIP.2017.2755766.
8
Triplet Deep Hashing with Joint Supervised Loss Based on Deep Neural Networks.
Comput Intell Neurosci. 2019 Oct 9;2019:8490364. doi: 10.1155/2019/8490364. eCollection 2019.
9
Strongly Constrained Discrete Hashing.
IEEE Trans Image Process. 2020 Jan 9. doi: 10.1109/TIP.2020.2963952.
10
Fast Cross-Modal Hashing With Global and Local Similarity Embedding.
IEEE Trans Cybern. 2022 Oct;52(10):10064-10077. doi: 10.1109/TCYB.2021.3059886. Epub 2022 Sep 19.

Cited By

1
An Efficient Supervised Deep Hashing Method for Image Retrieval.
Entropy (Basel). 2022 Oct 7;24(10):1425. doi: 10.3390/e24101425.
2
Deep Disentangled Hashing with Momentum Triplets for Neuroimage Search.
Med Image Comput Comput Assist Interv. 2020;12261:191-201. doi: 10.1007/978-3-030-59710-8_19. Epub 2020 Sep 29.
3
Weighted-Attribute Triplet Hashing for Large-Scale Similar Judicial Case Matching.
Comput Intell Neurosci. 2021 Apr 16;2021:6650962. doi: 10.1155/2021/6650962. eCollection 2021.