
Efficient IntVec: High recognition rate with reduced computational cost.

Author information

Fuzzy Logic Systems Institute, 680-41 Kawazu, Iizuka, Fukuoka, 820-0067, Japan.

Publication information

Neural Netw. 2019 Nov;119:323-331. doi: 10.1016/j.neunet.2019.08.024. Epub 2019 Aug 30.

Abstract

In many deep neural networks for pattern recognition, the input pattern is classified in the deepest layer based on features extracted through the intermediate layers. IntVec (interpolating-vector) is known to be a powerful method for this classification step. Although IntVec can achieve a much smaller recognition error than WTA (winner-take-all) or even SVM (support vector machines), it incurs a high computational cost. This paper proposes a new method by which the computational cost of IntVec can be reduced drastically without increasing the recognition error. Although we basically use IntVec for recognition, we substitute the much cheaper WTA for it under certain conditions. To be more specific, we first try to classify the input vector using WTA. If a class is a complete loser by WTA, we judge it to also be a loser by IntVec and omit the IntVec calculation for that class. If a class is an unrivaled winner by WTA, the IntVec calculation can be omitted for all classes.
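The two-stage screening described above can be sketched in Python. This is a minimal illustration, not the paper's implementation: the cosine-similarity formulation, the pairwise interpolation in `intvec_similarity`, and the `margin` threshold that decides "complete loser" and "unrivaled winner" are all assumptions made here for concreteness.

```python
import numpy as np

def wta_similarity(x, refs):
    # WTA: best cosine similarity between x and any single reference
    # vector of the class (cheap: one dot product per reference).
    sims = refs @ x / (np.linalg.norm(refs, axis=1) * np.linalg.norm(x) + 1e-12)
    return sims.max()

def intvec_similarity(x, refs):
    # IntVec: best similarity to a vector interpolating between pairs of
    # reference vectors of the class (costly: O(n^2) pairs per class).
    # Endpoints are single reference vectors, so IntVec >= WTA.
    best = wta_similarity(x, refs)
    xn = x / (np.linalg.norm(x) + 1e-12)
    for i in range(len(refs)):
        for j in range(i + 1, len(refs)):
            A = np.stack([refs[i], refs[j]], axis=1)
            # Least-squares coefficients of x in the span of the pair.
            coef, *_ = np.linalg.lstsq(A, x, rcond=None)
            if (coef >= 0).all():  # keep only non-negative interpolations
                v = A @ coef
                best = max(best, xn @ v / (np.linalg.norm(v) + 1e-12))
    return best

def classify(x, class_refs, margin=0.1):
    # Stage 1: cheap WTA screening over all classes.
    wta = {c: wta_similarity(x, r) for c, r in class_refs.items()}
    ranked = sorted(wta, key=wta.get, reverse=True)
    top = ranked[0]
    # "Unrivaled winner": WTA lead so large that IntVec is skipped entirely.
    if len(ranked) == 1 or wta[top] - wta[ranked[1]] > margin:
        return top
    # Stage 2: IntVec only for classes not ruled out ("complete losers"
    # by WTA are omitted from the expensive calculation).
    candidates = [c for c in ranked if wta[top] - wta[c] <= margin]
    iv = {c: intvec_similarity(x, class_refs[c]) for c in candidates}
    return max(iv, key=iv.get)

class_refs = {
    "A": np.array([[1.0, 0.0], [0.9, 0.2]]),
    "B": np.array([[0.0, 1.0], [0.2, 0.9]]),
}
print(classify(np.array([1.0, 0.1]), class_refs))  # -> A
```

The speed-up comes entirely from stage 1: with a clear WTA winner, the quadratic-in-references IntVec pass never runs, and otherwise it runs only on the few surviving candidate classes.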

