School of Information Engineering, Henan University of Science and Technology, Luoyang 471023, China.
Internet of Things & Smart City Innovation Platform, Zhuhai Fudan Innovation Institute, Zhuhai, China.
Comput Intell Neurosci. 2022 Jun 2;2022:9337209. doi: 10.1155/2022/9337209. eCollection 2022.
Adaptive algorithms are widely used for training deep neural networks (DNNs) because of their fast convergence rate. However, the training cost becomes prohibitively expensive when training complicated DNNs, due to the computation of the full gradient. To reduce this computational cost, we present a stochastic block adaptive gradient online training algorithm, called SBAG. In this algorithm, stochastic block coordinate descent and an adaptive learning rate are utilized at each iteration. We also prove that a regret bound of $O(\sqrt{T})$ can be achieved via SBAG, in which $T$ is the time horizon. In addition, we use SBAG to train ResNet-34 and DenseNet-121 on CIFAR-10, respectively. The results demonstrate that SBAG achieves faster training speed and better generalization ability than existing training methods.
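To make the core idea concrete, below is a minimal sketch of a block-wise adaptive gradient update in the spirit the abstract describes: at each iteration a random coordinate block is sampled, a stochastic gradient is computed only on that block (avoiding the full gradient), and an Adam-style adaptive step is applied to those coordinates. The function name `sbag_sketch`, the oracle `grad_fn`, and all hyperparameter defaults are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

def sbag_sketch(grad_fn, w, num_blocks=4, lr=1e-3, beta1=0.9, beta2=0.999,
                eps=1e-8, steps=1000, seed=0):
    """Illustrative block adaptive gradient loop (not the paper's code).

    grad_fn(w, idx) is assumed to return a stochastic gradient of the loss
    restricted to the coordinates in `idx`, so the full gradient is never
    formed.
    """
    rng = np.random.default_rng(seed)
    m = np.zeros_like(w)  # first-moment estimate, one entry per coordinate
    v = np.zeros_like(w)  # second-moment estimate, one entry per coordinate
    blocks = np.array_split(np.arange(w.size), num_blocks)
    for t in range(1, steps + 1):
        idx = blocks[rng.integers(num_blocks)]       # sample one random block
        g = grad_fn(w, idx)                          # gradient on that block only
        m[idx] = beta1 * m[idx] + (1 - beta1) * g    # update moments on the block
        v[idx] = beta2 * v[idx] + (1 - beta2) * g**2
        m_hat = m[idx] / (1 - beta1**t)              # bias-corrected moments
        v_hat = v[idx] / (1 - beta2**t)
        w[idx] -= lr * m_hat / (np.sqrt(v_hat) + eps)  # adaptive step on the block
    return w
```

Because each iteration touches only one block of coordinates, the per-step cost scales with the block size rather than the full parameter dimension, which is the source of the computational savings the abstract claims.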