Chen Kai, van Laarhoven Twan, Groot Perry, Chen Jinsong, Marchiori Elena
IEEE Trans Neural Netw Learn Syst. 2020 Dec;31(12):5613-5623. doi: 10.1109/TNNLS.2020.2980779. Epub 2020 Nov 30.
Multitask Gaussian processes (MTGPs) are a powerful approach for modeling dependencies between multiple related tasks or functions for joint regression. Current kernels for MTGPs cannot fully model nonlinear task correlations and other types of dependencies. In this article, we address this limitation. We focus on spectral mixture (SM) kernels and propose an enhancement of this type of kernel, called the multitask generalized convolution SM (MT-GCSM) kernel. The MT-GCSM kernel can model nonlinear task correlations and dependence between components, including time- and phase-delay dependence. Each task in MT-GCSM has its own GCSM kernel with its own number of convolution structures, and dependencies between all components from different tasks are considered. Another constraint of current kernels for MTGPs is that components from different tasks are aligned. Here, we lift this constraint by using inner and outer full cross convolution between a base component and the reversed complex conjugate of another base component. Extensive experiments on two synthetic and three real-life data sets illustrate the difference between MT-GCSM and previous SM kernels as well as the practical effectiveness of MT-GCSM.
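As background for readers unfamiliar with the SM kernels the article builds on, the following is a minimal sketch of the single-task spectral mixture kernel (a sum of weighted Gaussian-envelope cosines over the input distance), not the proposed MT-GCSM kernel itself; the function name and hyperparameter values are illustrative choices, not taken from the article.

```python
import numpy as np

def sm_kernel(x1, x2, weights, means, variances):
    """One-dimensional spectral mixture kernel: each component q contributes
    w_q * exp(-2*pi^2 * tau^2 * v_q) * cos(2*pi * mu_q * tau),
    where tau is the pairwise input difference, mu_q a spectral mean
    (frequency), and v_q a spectral variance (inverse length scale)."""
    tau = np.subtract.outer(x1, x2)  # pairwise differences x1[i] - x2[j]
    k = np.zeros_like(tau, dtype=float)
    for w, mu, v in zip(weights, means, variances):
        k += w * np.exp(-2.0 * np.pi**2 * tau**2 * v) * np.cos(2.0 * np.pi * mu * tau)
    return k

# Illustrative usage: a Gram matrix on a small grid with one mixture component.
x = np.linspace(0.0, 1.0, 5)
K = sm_kernel(x, x, weights=[1.0], means=[0.5], variances=[0.1])
```

The resulting Gram matrix is symmetric and positive semidefinite for any nonnegative weights; MT-GCSM extends this construction across tasks via cross convolution of such base components.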