Montreal Institute for Learning Algorithms (MILA), Canada; Aalto University, Finland.
Harvard University, USA.
Neural Netw. 2022 Jan;145:90-106. doi: 10.1016/j.neunet.2021.10.008. Epub 2021 Oct 21.
We introduce Interpolation Consistency Training (ICT), a simple and computationally efficient algorithm for training Deep Neural Networks in the semi-supervised learning paradigm. ICT encourages the prediction at an interpolation of unlabeled points to be consistent with the interpolation of the predictions at those points. In classification problems, ICT moves the decision boundary to low-density regions of the data distribution. Our experiments show that ICT achieves state-of-the-art performance when applied to standard neural network architectures on the CIFAR-10 and SVHN benchmark datasets. Our theoretical analysis shows that ICT corresponds to a certain type of data-adaptive regularization with unlabeled points, which reduces overfitting to labeled points under high confidence values.
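As a rough illustration of the consistency term described in the abstract, the sketch below is a hypothetical PyTorch implementation, not the authors' released code: it interpolates two unlabeled batches with a Beta-sampled mixing coefficient and penalizes the squared difference between the model's prediction at the mixed input and the mixture of its (detached) predictions at the original inputs. Using the same model for targets is a simplification assumed here for brevity.

```python
import torch
import torch.nn.functional as F
from torch.distributions import Beta

def ict_consistency_loss(model, u1, u2, alpha=1.0):
    """Interpolation consistency term on two unlabeled batches u1, u2.

    Encourages model(mix(u1, u2)) to match mix(model(u1), model(u2)).
    """
    lam = Beta(alpha, alpha).sample().to(u1.device)   # mixing coefficient in [0, 1]
    with torch.no_grad():                             # targets carry no gradient
        p1 = torch.softmax(model(u1), dim=1)
        p2 = torch.softmax(model(u2), dim=1)
        target = lam * p1 + (1 - lam) * p2            # interpolation of predictions
    mixed_input = lam * u1 + (1 - lam) * u2           # interpolation of inputs
    pred = torch.softmax(model(mixed_input), dim=1)
    return F.mse_loss(pred, target)
```

In a full training loop, this term would be added to the usual supervised cross-entropy on the labeled batch, typically scaled by a weight that ramps up over training; that weighting schedule is an assumption of this sketch, not something stated in the abstract.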