
Ordinal neural networks without iterative tuning.

Publication information

IEEE Trans Neural Netw Learn Syst. 2014 Nov;25(11):2075-85. doi: 10.1109/TNNLS.2014.2304976.

Abstract

Ordinal regression (OR) is an important branch of supervised learning that lies between multiclass classification and regression. In this paper, the traditional neural network classification scheme is adapted to learn ordinal ranks. The proposed model imposes monotonicity constraints on the weights connecting the hidden layer with the output layer. To do so, the weights are transcribed using padding variables. This reformulation leads to the so-called inequality constrained least squares (ICLS) problem. Its numerical solution can be obtained by several iterative methods, for example, trust-region or line-search algorithms. In this proposal, the optimum is instead determined analytically, using the closed-form solution of the ICLS problem derived from the Karush-Kuhn-Tucker conditions. Furthermore, following the guidelines of the extreme learning machine framework, the weights connecting the input and hidden layers are randomly generated, so the final model estimates all of its parameters without iterative tuning. The proposed model achieves competitive performance compared with state-of-the-art neural network methods for OR.

