Liu Pingping, Shi Lida, Miao Zhuang, Jin Baixin, Zhou Qiuzhan
College of Computer Science and Technology, Jilin University, Changchun 130012, China.
Key Laboratory of Symbolic Computation and Knowledge Engineering of Ministry of Education, Jilin University, Changchun 130012, China.
Entropy (Basel). 2020 Mar 11;22(3):321. doi: 10.3390/e22030321.
Convolutional neural networks (CNNs) are the most widely used solution in the field of image retrieval. Deep metric learning has been introduced into image retrieval, with a focus on the construction of pair-based loss functions. However, most pair-based loss functions in metric learning consider only a common vector similarity (such as the Euclidean distance) between the final image descriptors, while neglecting other distribution characteristics of these descriptors. In this work, we propose relative distribution entropy (RDE) to describe the internal distribution attributes of image descriptors. We combine relative distribution entropy with the Euclidean distance to obtain the relative distribution entropy weighted distance (RDE-distance). Moreover, the RDE-distance is fused with the contrastive loss and the triplet loss to build relative distribution entropy loss functions. Experimental results demonstrate that our method attains state-of-the-art performance on most image retrieval benchmarks.
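The abstract does not give the exact formula for RDE or for the weighting scheme, so the sketch below is only an illustration of the general idea, assuming the descriptor is softmax-normalized before taking Shannon entropy and that the entropy gap between two descriptors scales their Euclidean distance. The names relative_distribution_entropy, rde_distance, and rde_contrastive_loss, as well as the (1 + |ΔH|) weighting factor, are assumptions for illustration, not the paper's formulation.

```python
import torch
import torch.nn.functional as F

def relative_distribution_entropy(desc, eps=1e-12):
    # Assumed form of RDE: Shannon entropy over the descriptor's
    # dimensions after softmax normalization into a distribution.
    p = F.softmax(desc, dim=-1)
    return -(p * (p + eps).log()).sum(dim=-1)

def rde_distance(d1, d2):
    # Hypothetical RDE-weighted distance: Euclidean distance scaled by
    # a factor derived from the two descriptors' entropy difference.
    euclid = (d1 - d2).pow(2).sum(dim=-1).sqrt()
    rde_gap = (relative_distribution_entropy(d1)
               - relative_distribution_entropy(d2)).abs()
    return (1.0 + rde_gap) * euclid  # weighting scheme is an assumption

def rde_contrastive_loss(d1, d2, label, margin=0.5):
    # Contrastive loss with the plain Euclidean distance replaced by the
    # assumed RDE-distance; label = 1 for matching pairs, 0 otherwise.
    dist = rde_distance(d1, d2)
    pos = label * dist.pow(2)
    neg = (1 - label) * F.relu(margin - dist).pow(2)
    return (pos + neg).mean()
```

A triplet-loss variant would follow the same pattern, replacing the Euclidean anchor-positive and anchor-negative distances with rde_distance before applying the margin.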