IEEE Trans Pattern Anal Mach Intell. 2023 Jul;45(7):8936-8953. doi: 10.1109/TPAMI.2022.3226777. Epub 2023 Jun 5.
Searching for a more compact network width has recently emerged as an effective channel pruning approach for deploying convolutional neural networks (CNNs) under hardware constraints. To perform the search, a one-shot supernet is usually leveraged to efficiently evaluate the performance of different network widths. However, current methods mainly follow a unilaterally augmented (UA) principle for the evaluation of each width, which induces training unfairness among the channels of the supernet. In this article, we introduce a new supernet called Bilaterally Coupled Network (BCNet) to address this issue. In BCNet, each channel is fairly trained and responsible for the same number of network widths, so each network width can be evaluated more accurately. In addition, we propose to reduce the redundant search space and present BCNetV2 as an enhanced supernet to ensure rigorous training fairness over channels. Furthermore, we leverage a stochastic complementary strategy for training BCNet, and propose a prior initial population sampling method to boost the performance of the evolutionary search. We also propose Channel-Bench-Macro, a new open-source width-search benchmark on macro structures, for better comparison of width-search algorithms on MobileNet- and ResNet-like architectures. Extensive experiments on the benchmark datasets demonstrate that our method achieves state-of-the-art performance.
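The fairness argument above can be illustrated with a simple counting exercise. The sketch below is not the authors' implementation; it assumes, for illustration only, that a UA supernet evaluates width k with the leftmost k channels, while a bilaterally coupled supernet evaluates each width twice, once with the leftmost k channels and once with the rightmost k channels. Under these assumptions, UA selection trains low-index channels far more often, whereas the bilateral scheme gives every channel the same participation count.

```python
# Minimal sketch (assumed selection rules, not the paper's training code):
# count how many candidate widths each channel participates in.

def ua_usage_counts(num_channels: int) -> list[int]:
    """UA principle: width k is always evaluated with the leftmost k channels."""
    counts = [0] * num_channels
    for k in range(1, num_channels + 1):      # candidate width k
        for c in range(k):                    # leftmost k channels are used
            counts[c] += 1
    return counts


def bilateral_usage_counts(num_channels: int) -> list[int]:
    """Bilaterally coupled scheme (assumed): width k is evaluated with both
    the leftmost k channels and the rightmost k channels."""
    counts = [0] * num_channels
    for k in range(1, num_channels + 1):
        for c in range(k):                                    # left-aligned path
            counts[c] += 1
        for c in range(num_channels - k, num_channels):       # right-aligned path
            counts[c] += 1
    return counts


if __name__ == "__main__":
    n = 6
    print("UA counts:       ", ua_usage_counts(n))         # [6, 5, 4, 3, 2, 1] -> skewed
    print("Bilateral counts:", bilateral_usage_counts(n))  # [7, 7, 7, 7, 7, 7] -> uniform
```

With n channels, the UA rule uses channel c in (n - c) widths, while the bilateral rule uses every channel in exactly (n + 1) width evaluations, which is the sense in which each channel is "responsible for the same number of network widths."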