IEEE Trans Pattern Anal Mach Intell. 2023 Mar;45(3):3154-3168. doi: 10.1109/TPAMI.2022.3186715. Epub 2023 Feb 3.
We show that pre-trained Generative Adversarial Networks (GANs) such as StyleGAN and BigGAN can be used as a latent bank to improve the performance of image super-resolution. While most existing perceptual-oriented approaches attempt to generate realistic outputs through learning with adversarial loss, our method, Generative LatEnt bANk (GLEAN), goes beyond existing practices by directly leveraging rich and diverse priors encapsulated in a pre-trained GAN. Unlike prevalent GAN inversion methods that require expensive image-specific optimization at runtime, our approach only needs a single forward pass for restoration. GLEAN can be easily incorporated into a simple encoder-bank-decoder architecture with multi-resolution skip connections. Employing priors from different generative models allows GLEAN to be applied to diverse categories (e.g., human faces, cats, buildings, and cars). We further present a lightweight version of GLEAN, named LightGLEAN, which retains only the critical components of GLEAN. Notably, LightGLEAN has only 21% of the parameters and 35% of the FLOPs while achieving comparable image quality. We extend our method to different tasks, including image colorization and blind image restoration, and extensive experiments show that our proposed models perform favorably against existing methods. Code and models are available at https://github.com/open-mmlab/mmediting.
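To make the encoder-bank-decoder idea concrete, the following is a minimal, illustrative PyTorch sketch of that data flow, not the authors' GLEAN implementation (see the mmediting repository for that). The module names (ToyLatentBank, EncoderBankDecoder), layer sizes, and the toy stand-in generator are all hypothetical choices made only to show how an encoder's multi-resolution features can condition a frozen generative "bank" whose features a decoder then fuses via skip connections, all in a single forward pass.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ToyLatentBank(nn.Module):
    """Stand-in for a frozen pre-trained generator (the real GLEAN uses StyleGAN/BigGAN)."""

    def __init__(self, channels=64, num_blocks=3):
        super().__init__()
        self.blocks = nn.ModuleList(
            nn.Sequential(
                nn.Upsample(scale_factor=2, mode='nearest'),
                nn.Conv2d(channels, channels, 3, padding=1),
                nn.LeakyReLU(0.2, inplace=True),
            )
            for _ in range(num_blocks)
        )
        for p in self.parameters():  # the bank's priors stay frozen
            p.requires_grad_(False)

    def forward(self, latent, enc_feats):
        # latent: coarsest encoder feature; enc_feats: fine-to-coarse encoder features
        x, bank_feats = latent, []
        for block, skip in zip(self.blocks, reversed(enc_feats)):
            # condition each bank block on the encoder feature at the matching scale
            x = block(x + F.interpolate(skip, size=x.shape[-2:]))
            bank_feats.append(x)
        return bank_feats


class EncoderBankDecoder(nn.Module):
    """Single-forward-pass restoration: encoder -> frozen latent bank -> decoder."""

    def __init__(self, channels=64):
        super().__init__()
        self.enc1 = nn.Sequential(nn.Conv2d(3, channels, 3, padding=1), nn.LeakyReLU(0.2, True))
        self.enc2 = nn.Sequential(nn.Conv2d(channels, channels, 3, 2, 1), nn.LeakyReLU(0.2, True))
        self.enc3 = nn.Sequential(nn.Conv2d(channels, channels, 3, 2, 1), nn.LeakyReLU(0.2, True))
        self.bank = ToyLatentBank(channels)
        self.dec = nn.Sequential(
            nn.Conv2d(2 * channels, channels, 3, padding=1), nn.LeakyReLU(0.2, True),
            nn.Conv2d(channels, 3, 3, padding=1),
        )

    def forward(self, lr):
        f1 = self.enc1(lr)                     # full LR resolution
        f2 = self.enc2(f1)                     # 1/2 LR
        f3 = self.enc3(f2)                     # 1/4 LR, fed to the bank as the "latent"
        bank_feats = self.bank(f3, [f1, f2, f3])
        x = bank_feats[-1]                     # finest bank feature (2x LR in this toy setup)
        skip = F.interpolate(f1, size=x.shape[-2:])  # multi-resolution skip into the decoder
        return self.dec(torch.cat([x, skip], dim=1))


if __name__ == '__main__':
    sr = EncoderBankDecoder()(torch.randn(1, 3, 32, 32))
    print(sr.shape)  # torch.Size([1, 3, 64, 64]): one forward pass, no per-image optimization
```

In this toy configuration the bank only doubles the input resolution; the point of the sketch is the restoration path, in which the low-resolution image is encoded once, the frozen generative prior is queried in a single pass, and the decoder combines bank and encoder features, rather than any particular upscaling factor.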