Fan Qingnan, Chen Dongdong, Yuan Lu, Hua Gang, Yu Nenghai, Chen Baoquan
IEEE Trans Pattern Anal Mach Intell. 2021 Jan;43(1):33-47. doi: 10.1109/TPAMI.2019.2925793. Epub 2020 Dec 4.
Many different deep networks have been used to approximate, accelerate, or improve traditional image operators. Many of these operators contain parameters that must be tweaked to obtain satisfactory results; we refer to them as "parameterized image operators". However, most existing deep networks trained for these operators are designed for only one specific parameter configuration, which does not meet the needs of real scenarios that usually require flexible parameter settings. To overcome this limitation, we propose a new decoupled learning algorithm that learns from the operator parameters to dynamically adjust the weights of a deep network for image operators, denoted as the base network. The learning algorithm takes the form of another network, namely the weight learning network, which can be jointly trained end-to-end with the base network. Experiments demonstrate that the proposed framework can be successfully applied to many traditional parameterized image operators. To accelerate parameter tuning in practical scenarios, the framework can be further extended to dynamically change the weights of only a single layer of the base network while sharing most of the computation. We demonstrate that even this cheap parameter-tuning extension of the proposed decoupled learning framework outperforms state-of-the-art alternative approaches.
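To make the decoupling concrete, below is a minimal PyTorch sketch of the general idea: a weight learning network maps an operator parameter (e.g., a smoothing strength) to the convolution weights of one layer of a base network, and the two are trained jointly end-to-end. All names (WeightNet, BaseNet) and the architecture details are hypothetical illustrations, not the paper's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class WeightNet(nn.Module):
    """Hypothetical weight learning network: maps an operator-parameter
    vector to the weights of one conv layer of the base network."""
    def __init__(self, param_dim, out_ch, in_ch, k):
        super().__init__()
        self.shape = (out_ch, in_ch, k, k)
        self.fc = nn.Linear(param_dim, out_ch * in_ch * k * k)

    def forward(self, p):
        return self.fc(p).view(self.shape)

class BaseNet(nn.Module):
    """Hypothetical base network whose middle conv weights are supplied
    externally by WeightNet instead of being stored as parameters."""
    def __init__(self, ch=16):
        super().__init__()
        self.head = nn.Conv2d(3, ch, 3, padding=1)
        self.tail = nn.Conv2d(ch, 3, 3, padding=1)

    def forward(self, x, mid_weight):
        h = F.relu(self.head(x))
        h = F.relu(F.conv2d(h, mid_weight, padding=1))  # dynamic weights
        return self.tail(h)

# Joint end-to-end training step (sketch): sample an operator parameter,
# generate the dynamic weights, and regress the operator's output.
weight_net = WeightNet(param_dim=1, out_ch=16, in_ch=16, k=3)
base_net = BaseNet(ch=16)
opt = torch.optim.Adam(
    list(weight_net.parameters()) + list(base_net.parameters()), lr=1e-4)

lam = torch.rand(1)           # e.g., smoothing strength of the operator
x = torch.rand(1, 3, 64, 64)  # input image batch
target = x                    # placeholder for the operator's ground truth

w = weight_net(lam)
loss = F.mse_loss(base_net(x, w), target)
opt.zero_grad(); loss.backward(); opt.step()
```

Because only the middle layer's weights depend on the operator parameter here, changing the parameter at inference time only requires re-running WeightNet and that single convolution, which mirrors the cheap parameter-tuning extension described in the abstract.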