Li Wendi, Wang Ting, Ng Wing W Y
IEEE Trans Neural Netw Learn Syst. 2023 Sep;34(9):5719-5731. doi: 10.1109/TNNLS.2021.3130896. Epub 2023 Sep 1.
Population-based optimization methods are widely used for hyperparameter (HP) tuning for a given specific task. In this work, we propose population-based hyperparameter tuning with multitask collaboration (PHTMC), a general multitask collaborative framework with parallel and sequential phases for population-based HP tuning methods. In the parallel HP tuning phase, a shared population is kept for all tasks and the intertask relatedness is considered to both yield a better generalization ability and avoid data bias toward a single task. In the sequential HP tuning phase, a surrogate model is built for each newly added task so that metainformation from the existing tasks can be extracted and used to help initialize the new task. Experimental results show significant improvements in the generalization abilities of neural networks trained using the PHTMC and better performance achieved by multitask metalearning. Moreover, visualizations of the solution distributions and the autoencoder reconstructions of both the PHTMC and a single-task population-based HP tuning method are compared to analyze the properties of the multitask collaboration.
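The abstract outlines two phases: a parallel phase in which one shared population is scored across tasks, and a sequential phase in which a surrogate model built from existing tasks warm-starts a new task. The following is a minimal sketch of that idea, not the authors' implementation: the toy task objectives, the simple averaging of per-task scores, the (mu + lambda)-style mutation loop, and the random-forest surrogate are all illustrative assumptions, since the abstract does not specify PHTMC's operators or surrogate.

```python
# Minimal sketch (assumptions noted above): a population-based HP search where one
# shared population is scored across several tasks (parallel phase), and a simple
# surrogate fitted on the collected (HP, score) pairs warm-starts a new task
# (sequential phase).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Toy "tasks": each maps a 2-D hyperparameter vector to a validation score
# that we want to maximize (stand-ins for trained-network performance).
tasks = [
    lambda hp: -np.sum((hp - np.array([0.3, 0.7])) ** 2),
    lambda hp: -np.sum((hp - np.array([0.4, 0.6])) ** 2),
]

def evaluate_shared(population):
    """Score every individual on every task and average (intertask sharing)."""
    return np.array([np.mean([t(hp) for t in tasks]) for hp in population])

# ---- Parallel phase: one shared population, simple elitist evolution ----
pop = rng.uniform(0.0, 1.0, size=(20, 2))
history_hps, history_scores = [], []
for _ in range(30):
    scores = evaluate_shared(pop)
    history_hps.append(pop.copy())
    history_scores.append(scores)
    elite = pop[np.argsort(scores)[-10:]]                  # keep the best half
    children = elite + rng.normal(0.0, 0.05, elite.shape)  # Gaussian mutation
    pop = np.vstack([elite, np.clip(children, 0.0, 1.0)])

# ---- Sequential phase: surrogate built from existing tasks initializes a new task ----
X = np.vstack(history_hps)
y = np.concatenate(history_scores)
surrogate = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

candidates = rng.uniform(0.0, 1.0, size=(500, 2))
init_pop = candidates[np.argsort(surrogate.predict(candidates))[-20:]]

new_task = lambda hp: -np.sum((hp - np.array([0.35, 0.65])) ** 2)
print("best warm-start score on new task:",
      max(new_task(hp) for hp in init_pop))
```

Because the surrogate is trained only on metainformation gathered from the earlier tasks, the initial population for the new task concentrates in regions that performed well across tasks rather than starting from a uniform random sample.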