Department of Information and Electrical Engineering, Ludong University, Yantai, 264000, China; Yantai Research Institute of New Generation Information Technology, Southwest Jiaotong University, 264000, China.
Yantai Research Institute of New Generation Information Technology, Southwest Jiaotong University, 264000, China.
Neural Netw. 2020 May;125:142-152. doi: 10.1016/j.neunet.2020.01.035. Epub 2020 Feb 11.
Supervised cross-modal hashing has attracted widespread attention for large-scale retrieval tasks due to its promising retrieval performance. However, most existing works suffer from some of the following issues. First, most of them leverage only the pair-wise similarity matrix to learn hash codes, which may result in class information loss. Second, the pair-wise similarity matrix generally leads to high computational complexity and memory cost. Third, most of them relax the discrete constraints during optimization, which generally results in a large cumulative quantization error and consequently inferior hash codes. To address the above problems, we present a Fast Discrete Cross-modal Hashing method in this paper, FDCH for short. Specifically, it first leverages both class labels and the pair-wise similarity matrix to learn a shared Hamming space in which semantic consistency can be better preserved. Then we propose an asymmetric hash-code learning model to avoid the challenging problem of symmetric matrix factorization. Finally, an effective and efficient discrete optimization scheme is designed to generate discrete hash codes directly, and the computational complexity and memory cost caused by the pair-wise similarity matrix are reduced from O(n²) to O(n), where n denotes the size of the training set. Extensive experiments conducted on three real-world datasets highlight the superiority of FDCH compared with several cross-modal hashing methods and demonstrate its effectiveness and efficiency.
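The O(n²)-to-O(n) reduction claimed above rests on a standard algebraic trick for label-derived similarity matrices. The sketch below is illustrative only (the variable names and the exact similarity definition S = 2LLᵀ − 11ᵀ are assumptions, not the paper's notation): when the pair-wise similarity factorizes through the n×c label matrix L, any product S·V needed during optimization can be evaluated without ever materializing the n×n matrix S.

```python
import numpy as np

# Hedged sketch of the abstract's complexity reduction; names are illustrative.
# With a binary label matrix L (n x c), a common label-based similarity is
#   S = 2 * L @ L.T - 1   (n x n; +1 for samples sharing a label, -1 otherwise).
# A product S @ V (V is n x k) can be rewritten as
#   S @ V = 2 * L @ (L.T @ V) - ones(n,1) @ (ones(1,n) @ V),
# costing O(n*c*k) time and O(n*(c+k)) memory instead of O(n^2).

rng = np.random.default_rng(0)
n, c, k = 500, 10, 16                            # samples, classes, code length
L = (rng.random((n, c)) < 0.2).astype(float)     # hypothetical multi-label matrix
V = rng.standard_normal((n, k))                  # e.g. real codes of one modality

# Naive route: materialize the n x n similarity matrix (O(n^2) memory).
S = 2.0 * L @ L.T - 1.0
naive = S @ V

# Factorized route: never build S (memory linear in n).
col_sum = np.ones((1, n)) @ V                    # 1 x k, the column sums of V
fast = 2.0 * L @ (L.T @ V) - np.ones((n, 1)) @ col_sum

assert np.allclose(naive, fast)
```

Both routes produce the same matrix, which is why the asymmetric, label-factorized formulation can keep similarity supervision while scaling linearly in the training-set size.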