
Fine-Grained Classification via Categorical Memory Networks

Authors

Weijian Deng, Joshua Marsh, Stephen Gould, Liang Zheng

Publication

IEEE Trans Image Process. 2022;31:4186-4196. doi: 10.1109/TIP.2022.3181492. Epub 2022 Jun 20.

Abstract

Motivated by the desire to exploit patterns shared across classes, we present a simple yet effective class-specific memory module for fine-grained feature learning. The memory module stores the prototypical feature representation for each category as a moving average. We hypothesize that the combination of similarities with respect to each category is itself a useful discriminative cue. To detect these similarities, we use attention as a querying mechanism. The attention scores with respect to each class prototype are used as weights to combine prototypes via weighted sum, producing a uniquely tailored response feature representation for a given input. The original and response features are combined to produce an augmented feature for classification. We integrate our class-specific memory module into a standard convolutional neural network, yielding a Categorical Memory Network. Our memory module significantly improves accuracy over baseline CNNs, achieving competitive accuracy with state-of-the-art methods on four benchmarks, including CUB-200-2011, Stanford Cars, FGVC Aircraft, and NABirds.
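The mechanism the abstract describes can be sketched in a few lines: a per-class prototype bank updated as a moving average, an attention query that scores the input feature against every prototype, a response built as the attention-weighted sum of prototypes, and an augmented feature formed by combining the original and response features. The following is a minimal NumPy sketch under stated assumptions; the class name `CategoricalMemory`, the momentum value, dot-product attention, and concatenation as the combination step are illustrative choices, not the paper's exact architecture (the actual module is trained inside a CNN).

```python
import numpy as np


def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)


class CategoricalMemory:
    """Class-specific memory: one prototype feature per class,
    maintained as an exponential moving average (a sketch of the
    idea in the abstract, not the authors' implementation)."""

    def __init__(self, num_classes, dim, momentum=0.9):
        self.prototypes = np.zeros((num_classes, dim))
        self.momentum = momentum

    def update(self, feature, label):
        # Moving-average update of the ground-truth class prototype.
        self.prototypes[label] = (
            self.momentum * self.prototypes[label]
            + (1.0 - self.momentum) * feature
        )

    def query(self, feature):
        # Attention as a querying mechanism: score the input feature
        # against every class prototype (dot-product similarity here).
        scores = softmax(self.prototypes @ feature)
        # Response: attention-weighted sum of the prototypes,
        # tailored to this particular input.
        response = scores @ self.prototypes
        # Augmented feature: original and response features combined
        # (concatenation assumed) for the downstream classifier.
        return np.concatenate([feature, response])
```

In use, `update` would be called with backbone features of training images and their labels, and `query` would produce the augmented feature fed to the classification head.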

