Efficient Discrete Optimal Transport Algorithm by Accelerated Gradient Descent.

Authors

An Dongsheng, Lei Na, Xu Xiaoyin, Gu Xianfeng

Affiliations

Stony Brook University, NY, USA.

Dalian University of Technology, Liaoning, China.

Publication

Proc AAAI Conf Artif Intell. 2022 Jun;36(9):10119-10128. doi: 10.1609/aaai.v36i9.21251.

Abstract

Optimal transport (OT) plays an essential role in areas such as machine learning and deep learning. However, computing discrete OT for large-scale problems with adequate accuracy and efficiency is highly challenging. Recently, methods based on the Sinkhorn algorithm add an entropy regularizer to the primal problem and obtain a trade-off between efficiency and accuracy. In this paper, we propose a novel algorithm based on Nesterov's smoothing technique to further improve the efficiency and accuracy of computing OT. Specifically, the non-smooth c-transform of the Kantorovich potential is approximated by the smooth Log-Sum-Exp function, which smooths the original non-smooth Kantorovich dual functional. The smoothed Kantorovich functional can then be efficiently optimized by a fast proximal gradient method, the fast iterative shrinkage-thresholding algorithm (FISTA). Theoretically, the computational complexity of the proposed method is lower than the current estimate for the Sinkhorn algorithm. Experimentally, our results demonstrate that, compared with the Sinkhorn algorithm, the proposed method achieves faster convergence and better accuracy with the same parameter.
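
The pipeline the abstract describes, replacing the non-smooth c-transform min_i (C_ij - phi_i) with its Log-Sum-Exp smoothing and maximizing the smoothed Kantorovich dual with accelerated (FISTA-style) gradient steps, can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the authors' implementation: the function names, the step size 1/L with the Lipschitz bound L = 1/lam, and all parameter values are choices made for the sketch.

```python
import numpy as np

def smoothed_dual(phi, C, mu, nu, lam):
    """Smoothed Kantorovich dual: <mu, phi> + <nu, phi^c>, where the
    c-transform phi^c_j = min_i (C_ij - phi_i) is replaced by its
    Log-Sum-Exp smoothing -lam * log sum_i exp((phi_i - C_ij) / lam)."""
    M = (phi[:, None] - C) / lam
    m = M.max(axis=0)                       # stabilize the exponentials
    phi_c = -lam * (m + np.log(np.exp(M - m).sum(axis=0)))
    return mu @ phi + nu @ phi_c

def dual_grad(phi, C, mu, nu, lam):
    """Gradient of the smoothed dual: mu_i - sum_j nu_j * softmax_i((phi_i - C_ij)/lam)."""
    M = (phi[:, None] - C) / lam
    P = np.exp(M - M.max(axis=0))
    P /= P.sum(axis=0)                      # column-wise softmax over the source index i
    return mu - P @ nu

def fista_ot(C, mu, nu, lam=0.1, steps=1000):
    """Maximize the smoothed dual with FISTA-style accelerated gradient steps.
    With no non-smooth term, the proximal step reduces to a plain gradient
    step; the smoothed gradient is Lipschitz with constant L <= 1/lam."""
    phi = np.zeros(len(mu))
    y, t, L = phi.copy(), 1.0, 1.0 / lam
    for _ in range(steps):
        phi_next = y + dual_grad(y, C, mu, nu, lam) / L   # ascent step
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = phi_next + ((t - 1.0) / t_next) * (phi_next - phi)
        phi, t = phi_next, t_next
    return phi

def plan_from_potential(phi, C, nu, lam):
    """Recover an entropic transport plan from the potential; its columns
    sum to nu exactly, and its rows approach mu as the dual converges."""
    M = (phi[:, None] - C) / lam
    P = np.exp(M - M.max(axis=0))
    P /= P.sum(axis=0)
    return P * nu[None, :]
```

A smaller `lam` tightens the Log-Sum-Exp approximation of the c-transform but raises the Lipschitz constant 1/lam, so more iterations are needed; this mirrors the accuracy/efficiency trade-off the abstract attributes to the smoothing parameter.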

Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9472/10652357/8ce95e2e4b46/nihms-1943443-f0001.jpg

Similar Articles

1. Optimized first-order methods for smooth convex minimization.
   Math Program. 2016 Sep;159(1):81-107. doi: 10.1007/s10107-015-0949-3. Epub 2015 Oct 17.
2. Adaptive Restart of the Optimized Gradient Method for Convex Optimization.
   J Optim Theory Appl. 2018 Jul;178(1):240-263. doi: 10.1007/s10957-018-1287-4. Epub 2018 May 7.
3. The Strength of Nesterov's Extrapolation in the Individual Convergence of Nonsmooth Optimization.
   IEEE Trans Neural Netw Learn Syst. 2020 Jul;31(7):2557-2568. doi: 10.1109/TNNLS.2019.2933452. Epub 2019 Sep 2.
