Zhao Hui, An Jing, Yu Mengjie, Lv Diankai, Kuang Kaida, Zhang Tianqi
Appl Opt. 2021 Aug 20;60(24):7177-7185. doi: 10.1364/AO.428465.
To improve the wavefront-distortion correction performance of the classical stochastic parallel gradient descent (SPGD) algorithm, an optimized algorithm based on Nesterov-accelerated adaptive momentum estimation is proposed. It adopts a modified second-order momentum term and a linearly varying gain coefficient to improve iterative stability, and it integrates the Nesterov momentum term with a modified Adam optimizer to further accelerate convergence, correct the gradient-descent direction in a timely fashion, and avoid becoming trapped in local extrema. In addition, to demonstrate the algorithm's performance, a wavefront-sensorless adaptive optics system model is established with a 6×6-element deformable mirror as the wavefront corrector. Simulation results show that, compared with the SPGD algorithm, the proposed algorithm converges faster, and its Strehl ratio after convergence is nearly 6.25 times that of SPGD. The effectiveness and superiority of the proposed algorithm are further verified by comparison with two existing optimization algorithms.
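The abstract's ingredients (a two-sided SPGD perturbation gradient estimate, Adam-style first and second moments with a Nesterov look-ahead, and a linearly varying gain coefficient) can be combined into a short simulation sketch. This is only an illustrative reconstruction, not the paper's code: the quadratic `metric`, its optimum `u_opt`, and all hyperparameter values are assumptions standing in for the real far-field performance metric (e.g. Strehl ratio) and the 6×6 deformable-mirror model.

```python
import numpy as np

rng = np.random.default_rng(0)
n_act = 36                       # 6x6 deformable-mirror actuators
u_opt = rng.normal(size=n_act)   # hypothetical ideal control voltages (assumption)

def metric(u):
    # Toy stand-in for the measured performance metric (higher is better);
    # a real system would evaluate, e.g., the far-field Strehl ratio.
    return -np.sum((u - u_opt) ** 2)

# Illustrative hyperparameters (not taken from the paper)
beta1, beta2, eps = 0.9, 0.999, 1e-8
sigma = 0.05                     # random perturbation amplitude
gain0, gain1 = 0.5, 0.05         # linearly decreasing gain coefficient
iters = 300

u = np.zeros(n_act)              # control voltages
m = np.zeros(n_act)              # first-moment (momentum) estimate
v = np.zeros(n_act)              # second-moment estimate
j_start = metric(u)

for k in range(1, iters + 1):
    # Two-sided stochastic parallel perturbation gradient estimate (SPGD core)
    du = sigma * rng.choice([-1.0, 1.0], size=n_act)
    dj = metric(u + du) - metric(u - du)
    g = dj * du / (2 * sigma ** 2)          # ascent-direction estimate

    # Adam-style moment updates with bias correction
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g ** 2
    m_hat = m / (1 - beta1 ** k)
    v_hat = v / (1 - beta2 ** k)

    # Nesterov look-ahead on the first moment (NAdam-style combination)
    m_nes = beta1 * m_hat + (1 - beta1) * g / (1 - beta1 ** k)

    # Linearly varying gain coefficient for iterative stability
    gain = gain0 + (gain1 - gain0) * k / iters
    u += gain * m_nes / (np.sqrt(v_hat) + eps)

j_end = metric(u)
```

The normalization by the second moment keeps the step size bounded while the linearly decaying gain damps late-iteration oscillation; with the toy metric above, `j_end` should be well above `j_start` after convergence.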