Shao Hao, Wang Shunfang
School of Mathematics and Statistics, Yunnan University, Kunming 650504, China.
School of Information Science and Engineering, Yunnan University, Kunming 650504, China.
Entropy (Basel). 2023 Apr 27;25(5):727. doi: 10.3390/e25050727.
Recently, deep classification tasks such as image recognition and object detection have proliferated. As one of the most crucial components in Convolutional Neural Network (CNN) architectures, the softmax function arguably helps CNNs achieve better performance in image recognition. Within this scheme, we present a conceptually intuitive learning objective function: Orthogonal-Softmax. Its primary property is the use of a linear approximation model designed via Gram-Schmidt orthogonalization. First, compared with the traditional softmax and Taylor-Softmax, Orthogonal-Softmax exhibits a stronger relationship through orthogonal polynomial expansion. Second, a new loss function is proposed to acquire highly discriminative features for classification tasks. Finally, we present a linear softmax loss that further promotes intra-class compactness and inter-class discrepancy simultaneously. Extensive experiments on four benchmark datasets demonstrate the validity of the presented method. In future work, we intend to explore non-ground-truth samples.
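The abstract does not spell out the construction of Orthogonal-Softmax, so the following is only a minimal sketch of the general idea it names: the second-order Taylor-Softmax replaces exp with its Taylor polynomial, while an orthogonalized variant instead projects exp onto a polynomial basis orthonormalized by Gram-Schmidt. The function names, the approximation interval, and the discrete inner product are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

def softmax(z):
    """Standard softmax with a max-shift for numerical stability."""
    e = np.exp(z - z.max())
    return e / e.sum()

def taylor_softmax(z):
    """Second-order Taylor-Softmax: exp is replaced by its Taylor
    polynomial f(z) = 1 + z + z^2/2, which is strictly positive
    for all real z, so the normalized outputs form a distribution."""
    f = 1.0 + z + 0.5 * z ** 2
    return f / f.sum()

def orthogonal_poly_softmax(z, degree=2, grid=None):
    """Hypothetical sketch: approximate exp by its projection onto a
    polynomial basis that Gram-Schmidt orthonormalizes under the
    discrete inner product <f, g> = sum_t f(t) g(t) over a grid.
    (QR decomposition performs this orthonormalization stably.)"""
    if grid is None:
        grid = np.linspace(-4.0, 4.0, 201)  # assumed approximation interval
    # Monomial basis evaluated on the grid: columns are 1, t, t^2, ...
    V = np.vander(grid, degree + 1, increasing=True)
    # Gram-Schmidt orthonormalization of the columns via QR.
    Q, R = np.linalg.qr(V)
    # Projection coefficients of exp in the orthonormal basis: c_k = <exp, q_k>.
    c = Q.T @ np.exp(grid)
    # Convert back to monomial coefficients so we can evaluate at arbitrary z.
    a = np.linalg.solve(R, c)
    f = np.polyval(a[::-1], z)       # polyval expects highest degree first
    f = np.maximum(f, 1e-12)         # clamp: the projection may dip below 0
    return f / f.sum()

z = np.array([1.5, 0.2, -0.7])
print(softmax(z))
print(taylor_softmax(z))
print(orthogonal_poly_softmax(z))
```

On logits inside the fitting interval, all three functions return similar probability vectors; the orthogonal projection is the least-squares-optimal polynomial approximation of exp under the chosen inner product, which is one plausible reading of the "stronger relationship through orthogonal polynomial expansion" claimed above.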