Xu Ping, Wang Yue, Chen Xiang, Tian Zhi
IEEE Trans Neural Netw Learn Syst. 2024 Dec;35(12):17987-17999. doi: 10.1109/TNNLS.2023.3310499. Epub 2024 Dec 2.
This article focuses on online kernel learning over a decentralized network. Each agent in the network receives online streaming data and collaboratively learns a globally optimal nonlinear prediction function in the reproducing kernel Hilbert space (RKHS). To overcome the curse of dimensionality issue in traditional online kernel learning, we utilize random feature (RF) mapping to convert the nonparametric kernel learning problem into a fixed-length parametric one in the RF space. We then propose a novel learning framework, named online decentralized kernel learning via linearized ADMM (ODKLA), to efficiently solve the online decentralized kernel learning problem. To enhance communication efficiency, we introduce quantization and censoring strategies in the communication stage, resulting in the quantized and communication-censored ODKLA (QC-ODKLA) algorithm. We theoretically prove that both ODKLA and QC-ODKLA can achieve the optimal sublinear regret over time slots. Through numerical experiments, we evaluate the learning effectiveness, communication efficiency, and computation efficiency of the proposed methods.
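The random feature (RF) mapping mentioned above is the step that turns the nonparametric kernel problem into a fixed-length parametric one. The following is a minimal sketch of that idea using random Fourier features for a Gaussian (RBF) kernel, followed by a plain streaming least-squares update on the mapped data; the feature dimension D, bandwidth sigma, step size, and helper names are illustrative assumptions, and the decentralized linearized-ADMM consensus step of ODKLA is not reproduced here.

```python
import numpy as np

def random_fourier_features(X, D=200, sigma=1.0, rng=None):
    """Map inputs X (n x d) into a D-dimensional random feature (RF) space
    whose inner products approximate an RBF kernel k(x, y).
    D, sigma, and this helper are illustrative choices, not the paper's code."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    # Sample spectral frequencies from the Fourier transform of the RBF kernel
    W = rng.normal(scale=1.0 / sigma, size=(d, D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    # z(x) = sqrt(2/D) * cos(x W + b), so z(x)^T z(y) ~ k(x, y)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 3))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)

    # Fixed-length parametric model in the RF space
    Z = random_fourier_features(X, D=200, sigma=1.0, rng=0)
    theta = np.zeros(Z.shape[1])
    step = 0.1
    for zt, yt in zip(Z, y):           # streaming (online) pass over the data
        grad = (zt @ theta - yt) * zt  # gradient of the squared loss at time t
        theta -= step * grad
    print("train MSE:", np.mean((Z @ theta - y) ** 2))
```

In the decentralized setting of the paper, each agent would run such an online update on its own stream while exchanging (possibly quantized and censored) RF-space parameters with neighbors to reach consensus on a global predictor.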