IEEE Trans Image Process. 2016 Oct;25(10):4514-24. doi: 10.1109/TIP.2016.2593344. Epub 2016 Jul 19.
Hash-based nearest neighbor search has become attractive in many applications. However, the quantization in hashing usually degrades the discriminative power of Hamming distance ranking. Moreover, for large-scale visual search, existing hashing methods cannot directly support efficient search over data with multiple sources, even though the literature has shown that adaptively incorporating complementary information from diverse sources or views can significantly boost search performance. To address these problems, this paper proposes a novel and generic approach that builds multiple hash tables over multiple views and generates fine-grained ranking results at both the bitwise and tablewise levels. For each hash table, a query-adaptive bitwise weighting is introduced to alleviate the quantization loss by simultaneously exploiting the quality of the hash functions and their complementarity for nearest neighbor search. At the tablewise level, multiple hash tables are built for different data views as a joint index, over which a query-specific rank fusion is proposed to rerank all results from the bitwise ranking by diffusion in a graph. Comprehensive experiments on image search over three well-known benchmarks show that the proposed method achieves up to 17.11% and 20.28% performance gains in single- and multiple-table search over state-of-the-art methods.
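The bitwise ranking stage described above can be illustrated with a minimal sketch of weighted Hamming distance ranking. The sketch below uses fixed toy weights for clarity; in the paper the weights are derived adaptively per query from the quality and complementarity of the hash functions, a step not reproduced here.

```python
import numpy as np

def weighted_hamming_rank(query_code, db_codes, bit_weights):
    """Rank database items by weighted Hamming distance to a query code.

    query_code:  (b,) boolean array, the query's binary hash code
    db_codes:    (n, b) boolean array, one binary code per database item
    bit_weights: (b,) nonnegative per-bit weights (query-adaptive in the paper)
    """
    diffs = db_codes != query_code   # (n, b): True where a bit disagrees
    dists = diffs @ bit_weights      # weighted Hamming distance per item
    return np.argsort(dists)         # database indices, nearest first

# Toy example: four database items with 4-bit codes (hypothetical data).
db = np.array([[0, 1, 1, 0],
               [1, 1, 1, 0],
               [0, 0, 0, 0],
               [1, 0, 1, 1]], dtype=bool)
q = np.array([0, 1, 1, 0], dtype=bool)
w = np.array([0.4, 0.3, 0.2, 0.1])   # toy bit weights, higher = more reliable bit
order = weighted_hamming_rank(q, db, w)
print(order)  # item 0 matches exactly, so it ranks first
```

With uniform weights this reduces to plain Hamming ranking; nonuniform weights break the coarse Hamming ties, which is what yields the fine-grained bitwise ranking the abstract refers to.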