Bauer Alexander, Nakajima Shinichi, Müller Klaus-Robert
IEEE Trans Neural Netw Learn Syst. 2017 Nov;28(11):2566-2579. doi: 10.1109/TNNLS.2016.2598721. Epub 2016 Aug 19.
Structural support vector machine (SVM) is an elegant approach for building complex and accurate models with structured outputs. However, its applicability relies on the availability of efficient inference algorithms: the state-of-the-art training algorithms repeatedly perform inference to compute a subgradient or to find the most violating configuration. In this paper, we propose an exact inference algorithm for maximizing objectives that are nondecomposable due to a special type of high-order potential with a decomposable internal structure. As an important application, our method covers loss-augmented inference, which enables the slack- and margin-scaling formulations of structural SVM with a variety of dissimilarity measures, e.g., Hamming loss, precision and recall, Fβ-loss, intersection over union, and many other functions that can be efficiently computed from the contingency table. We demonstrate the advantages of our approach in natural language parsing and sequence segmentation applications.
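The abstract refers to dissimilarity measures that can be computed efficiently from the contingency table. The sketch below, which is illustrative only and not the authors' code, shows how the losses named above (Hamming loss, Fβ-loss, intersection over union) follow from the four contingency-table counts of a binary labelling; all function and variable names are assumptions introduced here.

```python
def contingency_table(y_true, y_pred):
    """Count (tp, fp, fn, tn) for two equal-length binary sequences."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return tp, fp, fn, tn

def hamming_loss(tp, fp, fn, tn):
    """Fraction of positions where the two labellings disagree."""
    return (fp + fn) / (tp + fp + fn + tn)

def f_beta_loss(tp, fp, fn, tn, beta=1.0):
    """1 - F_beta, where F_beta trades off precision against recall."""
    b2 = beta * beta
    denom = (1 + b2) * tp + b2 * fn + fp
    return 1.0 - ((1 + b2) * tp / denom if denom > 0 else 0.0)

def iou_loss(tp, fp, fn, tn):
    """1 - intersection over union of the positively labelled positions."""
    denom = tp + fp + fn
    return 1.0 - (tp / denom if denom > 0 else 1.0)
```

For example, `contingency_table([1, 1, 0, 0], [1, 0, 1, 0])` gives `(1, 1, 1, 1)`, from which the Hamming loss is 0.5, the F1-loss is 0.5, and the IoU-loss is 2/3. Because such losses depend on the full contingency table rather than on per-position terms, they do not decompose over the output variables, which is exactly the setting the paper's exact inference algorithm addresses.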