Zhen Xingjian, Meng Zihang, Chakraborty Rudrasis, Singh Vikas
University of Wisconsin-Madison.
Butlr.
Comput Vis ECCV. 2022 Oct;13686:327-346. doi: 10.1007/978-3-031-19809-0_19. Epub 2022 Nov 1.
Comparing the functional behavior of neural network models, whether it is a single network over time or two (or more) networks during or post-training, is an essential step in understanding what they are learning (and what they are not), and for identifying strategies for regularization or efficiency improvements. Despite recent progress, e.g., comparing vision transformers to CNNs, systematic comparison of function, especially across different networks, remains difficult and is often carried out layer by layer. Approaches such as canonical correlation analysis (CCA) are applicable in principle, but have been sparingly used so far. In this paper, we revisit a (less widely known) tool from statistics, called distance correlation (and its partial variant), designed to evaluate correlation between feature spaces of different dimensions. We describe the steps necessary to carry out its deployment for large-scale models - this opens the door to a surprising array of applications ranging from conditioning one deep model w.r.t. another, learning disentangled representations, as well as optimizing diverse models that would directly be more robust to adversarial attacks. Our experiments suggest a versatile regularizer (or constraint) with many advantages, which avoids some of the common difficulties one faces in such analyses.
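As a rough illustration (not the authors' implementation), the sample distance correlation of Székely et al. that the abstract refers to can be sketched in a few lines of NumPy. Because it operates only on the pairwise distance matrices within each feature space, it directly compares feature matrices of different dimensions, e.g., activations from two different networks:

```python
import numpy as np

def distance_correlation(X, Y):
    """Empirical distance correlation between paired samples X (n, p)
    and Y (n, q). Note p and q may differ: only within-space pairwise
    distances are used, so features from different layers/networks
    can be compared directly."""
    X, Y = np.atleast_2d(X), np.atleast_2d(Y)
    # Pairwise Euclidean distance matrices within each feature space
    a = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    b = np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=-1)
    # Double-center each distance matrix
    A = a - a.mean(axis=0) - a.mean(axis=1)[:, None] + a.mean()
    B = b - b.mean(axis=0) - b.mean(axis=1)[:, None] + b.mean()
    # Sample distance covariance and variances
    dcov2 = (A * B).mean()
    denom = np.sqrt((A * A).mean() * (B * B).mean())
    return float(np.sqrt(dcov2 / denom)) if denom > 0 else 0.0

# Toy check: an 8-dim feature set vs. a dependent 16-dim one
rng = np.random.default_rng(0)
F1 = rng.normal(size=(200, 8))
F2 = np.hstack([F1 ** 2, np.sin(F1)])  # nonlinear function of F1
print(distance_correlation(F1, F2))
```

The returned value lies in [0, 1] and is zero (in the population) only under independence, which is what makes it usable as a differentiable regularizer between two models' representations; the paper's contribution concerns scaling this idea to large models, which this O(n²)-memory sketch does not address.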