Discriminative Hierarchical K-Means Tree for Large-Scale Image Classification.

Publication Information

IEEE Trans Neural Netw Learn Syst. 2015 Sep;26(9):2200-5. doi: 10.1109/TNNLS.2014.2366476. Epub 2014 Nov 20.

Abstract

A key challenge in large-scale image classification is how to achieve efficiency in terms of both computation and memory without compromising classification accuracy. Learning-based classifiers achieve state-of-the-art accuracies, but have been criticized because their computational complexity grows linearly with the number of classes. Nonparametric nearest neighbor (NN)-based classifiers naturally handle large numbers of categories, but incur prohibitively expensive computation and memory costs. In this brief, we present a novel classification scheme, the discriminative hierarchical K-means tree (D-HKTree), which combines the advantages of both learning-based and NN-based classifiers. The complexity of the D-HKTree grows only sublinearly with the number of categories, which is much better than recent hierarchical support vector machine-based methods. The memory requirement is an order of magnitude less than that of recent Naïve Bayesian NN-based approaches. The proposed D-HKTree classification scheme is evaluated on several challenging benchmark databases and achieves state-of-the-art accuracies, while incurring significantly lower computation cost and memory requirements.
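The sublinear scaling claimed above comes from descending a hierarchy of K-means partitions instead of evaluating one classifier per class. The following Python sketch, assuming NumPy and scikit-learn are available, illustrates only that plain hierarchical K-means traversal: the tree is built by recursive K-means splits and a query is classified by following the nearest child centroid at each level. The discriminative node models that give the actual D-HKTree its accuracy are not described in the abstract and are not reproduced here; names such as build_hktree and classify are purely illustrative.

```python
import numpy as np
from sklearn.cluster import KMeans

class HKTreeNode:
    """Node of a plain hierarchical K-means tree (illustrative, not the paper's D-HKTree)."""
    def __init__(self, centroid, children=None, label=None):
        self.centroid = centroid        # representative descriptor for this node
        self.children = children or []  # empty list => leaf
        self.label = label              # majority class label, set at leaves only

def _leaf(X, y):
    values, counts = np.unique(y, return_counts=True)
    return HKTreeNode(X.mean(axis=0), label=values[np.argmax(counts)])

def build_hktree(X, y, branching=4, min_leaf=20):
    """Recursively partition (X, y) with K-means until nodes are small or pure."""
    if len(X) <= min_leaf or len(np.unique(y)) == 1:
        return _leaf(X, y)
    km = KMeans(n_clusters=branching, n_init=4, random_state=0).fit(X)
    if len(np.unique(km.labels_)) < 2:  # degenerate split (e.g. duplicate points)
        return _leaf(X, y)
    children = []
    for c in np.unique(km.labels_):
        mask = km.labels_ == c
        child = build_hktree(X[mask], y[mask], branching, min_leaf)
        child.centroid = km.cluster_centers_[c]
        children.append(child)
    return HKTreeNode(X.mean(axis=0), children=children)

def classify(root, x):
    """Descend by nearest child centroid; cost ~ O(branching * depth), not O(#classes)."""
    node = root
    while node.children:
        dists = [np.linalg.norm(x - child.centroid) for child in node.children]
        node = node.children[int(np.argmin(dists))]
    return node.label
```

In this form a query touches only branching × depth centroids, roughly logarithmic in the number of leaves, which is where the sublinear dependence on the number of categories comes from; the paper's contribution is to make the tree nodes discriminative rather than purely unsupervised K-means partitions.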
