

MIGO-NAS: Towards Fast and Generalizable Neural Architecture Search.

Publication

IEEE Trans Pattern Anal Mach Intell. 2021 Sep;43(9):2936-2952. doi: 10.1109/TPAMI.2021.3065138. Epub 2021 Aug 4.

DOI: 10.1109/TPAMI.2021.3065138
PMID: 33710952
Abstract

Neural architecture search (NAS) has achieved unprecedented performance in various computer vision tasks. However, most existing NAS methods are deficient in search efficiency and model generalizability. In this paper, we propose a novel NAS framework, termed MIGO-NAS, aiming to guarantee efficiency and generalizability in arbitrary search spaces. On the one hand, we formulate the search space as a multivariate probabilistic distribution, which is then optimized by a novel multivariate information-geometric optimization (MIGO). By approximating the distribution with a sampling, training, and testing pipeline, MIGO guarantees memory efficiency, training efficiency, and search flexibility. Moreover, MIGO is the first to reduce the estimation error of the natural gradient in a multivariate distribution. On the other hand, for a set of specific constraints, the neural architectures are generated by a novel dynamic programming network generation (DPNG), which significantly reduces the training cost under various hardware environments. Experiments validate the advantages of our approach over existing methods by establishing superior accuracy and efficiency, i.e., a 2.39% test error on the CIFAR-10 benchmark and 21.7% on the ImageNet benchmark, with only 1.5 GPU hours and 96 GPU hours of searching, respectively. Moreover, the searched architectures generalize well to computer vision tasks including object detection and semantic segmentation, i.e., 25× FLOPs compression with a 6.4 mAP gain on the Pascal VOC dataset, and 29.9× FLOPs compression with only a 1.41-point performance drop on the Cityscapes dataset. The code is publicly available.
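The search pipeline the abstract outlines — sample architectures from a probability distribution over the search space, evaluate them, then update the distribution from the scores — can be sketched minimally. This is an illustrative REINFORCE-style toy, not the paper's MIGO update: the categorical parameterization, the synthetic score function, and every size and hyperparameter here are invented for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy search space: 4 edges, each choosing one of 3 candidate operations.
N_EDGES, N_OPS = 4, 3
theta = np.zeros((N_EDGES, N_OPS))  # softmax logits of the categorical distribution

def probs(theta):
    """Per-edge categorical probabilities from the logits."""
    e = np.exp(theta - theta.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def evaluate(arch):
    # Stand-in for "train and test a sampled architecture": a synthetic
    # score that simply prefers operation 0 on every edge.
    return float(np.sum(arch == 0))

LR, SAMPLES, STEPS = 0.5, 8, 50
for _ in range(STEPS):
    p = probs(theta)
    archs = np.array([[rng.choice(N_OPS, p=p[e]) for e in range(N_EDGES)]
                      for _ in range(SAMPLES)])
    scores = np.array([evaluate(a) for a in archs])
    baseline = scores.mean()  # variance-reduction baseline
    grad = np.zeros_like(theta)
    for a, s in zip(archs, scores):
        onehot = np.eye(N_OPS)[a]              # (N_EDGES, N_OPS)
        grad += (s - baseline) * (onehot - p)  # log-likelihood gradient
    theta += LR * grad / SAMPLES

# With this synthetic score, the distribution typically concentrates on op 0.
best = probs(theta).argmax(axis=1)
print(best)
```

The sample-evaluate-update loop is the part shared with distribution-based NAS; MIGO's contribution, per the abstract, lies in the multivariate distribution and its natural-gradient estimation, which this plain gradient step does not reproduce.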


Similar Articles

1. MIGO-NAS: Towards Fast and Generalizable Neural Architecture Search.
   IEEE Trans Pattern Anal Mach Intell. 2021 Sep;43(9):2936-2952. doi: 10.1109/TPAMI.2021.3065138. Epub 2021 Aug 4.
2. You Only Search Once: Single Shot Neural Architecture Search via Direct Sparse Optimization.
   IEEE Trans Pattern Anal Mach Intell. 2021 Sep;43(9):2891-2904. doi: 10.1109/TPAMI.2020.3020300. Epub 2021 Aug 4.
3. RelativeNAS: Relative Neural Architecture Search via Slow-Fast Learning.
   IEEE Trans Neural Netw Learn Syst. 2023 Jan;34(1):475-489. doi: 10.1109/TNNLS.2021.3096658. Epub 2023 Jan 5.
4. ReCNAS: Resource-Constrained Neural Architecture Search Based on Differentiable Annealing and Dynamic Pruning.
   IEEE Trans Neural Netw Learn Syst. 2024 Feb;35(2):2805-2819. doi: 10.1109/TNNLS.2022.3192169. Epub 2024 Feb 5.
5. FNA++: Fast Network Adaptation via Parameter Remapping and Architecture Search.
   IEEE Trans Pattern Anal Mach Intell. 2021 Sep;43(9):2990-3004. doi: 10.1109/TPAMI.2020.3044416. Epub 2021 Aug 4.
6. One-Shot Neural Architecture Search: Maximising Diversity to Overcome Catastrophic Forgetting.
   IEEE Trans Pattern Anal Mach Intell. 2021 Sep;43(9):2921-2935. doi: 10.1109/TPAMI.2020.3035351. Epub 2021 Aug 4.
7. Deeply Supervised Block-Wise Neural Architecture Search.
   IEEE Trans Neural Netw Learn Syst. 2025 Feb;36(2):2451-2464. doi: 10.1109/TNNLS.2023.3347542. Epub 2025 Feb 6.
8. NAS-HRIS: Automatic Design and Architecture Search of Neural Network for Semantic Segmentation in Remote Sensing Images.
   Sensors (Basel). 2020 Sep 16;20(18):5292. doi: 10.3390/s20185292.
9. Partially-Connected Neural Architecture Search for Reduced Computational Redundancy.
   IEEE Trans Pattern Anal Mach Intell. 2021 Sep;43(9):2953-2970. doi: 10.1109/TPAMI.2021.3059510. Epub 2021 Aug 4.
10. Block Proposal Neural Architecture Search.
    IEEE Trans Image Process. 2021;30:15-25. doi: 10.1109/TIP.2020.3028288. Epub 2020 Nov 18.

Cited By

1. Toward general object search in open reality.
   Sci Rep. 2025 Apr 19;15(1):13523. doi: 10.1038/s41598-025-97251-5.