The Institute of Statistical Mathematics, 10-3 Midori-cho, Tachikawa, Tokyo 190-0014, Japan.
Kyushu Institute of Technology, 1-1 Sensui-cho, Tobata-ku, Fukuoka, 804-8550, Japan.
Neural Netw. 2023 Sep;166:446-458. doi: 10.1016/j.neunet.2023.07.011. Epub 2023 Jul 26.
Neural architecture search (NAS) is a framework for automating the design of neural network structures. While recent one-shot approaches have reduced the search cost, an inherent trade-off between cost and performance remains, so stopping the search at the right time is important for further reducing the high cost of NAS. Meanwhile, differentiable architecture search (DARTS), a typical one-shot approach, is known to suffer from overfitting, and heuristic early-stopping strategies have been proposed to overcome this performance degradation. In this paper, we propose a more versatile and principled early-stopping criterion based on evaluating the gap between the expected generalisation errors of the previous and current search steps with respect to the architecture parameters. The stopping threshold is determined automatically at each search epoch at no additional cost. In numerical experiments, we demonstrate the effectiveness of the proposed method: we stop one-shot NAS algorithms and evaluate the acquired architectures on the NAS-Bench-201 and NATS-Bench benchmarks. Our algorithm is shown to reduce the cost of the search process while maintaining high performance.
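To make the general shape of such a criterion concrete, the following is a minimal, self-contained Python sketch of an early-stopped search loop. It is not the paper's estimator: the `search_step` stand-in, the moving-average estimate of the expected validation loss, the window size, and the standard-error-based threshold are all illustrative assumptions.

```python
import random


def search_step(epoch):
    """Stand-in for one epoch of a one-shot NAS search (e.g. a DARTS-style
    update of architecture parameters). Returns a noisy validation loss
    that decreases early on and then plateaus, as a toy proxy for the
    generalisation error under the current architecture parameters."""
    return 1.0 / (1.0 + 0.5 * epoch) + random.gauss(0.0, 0.01)


def run_search(max_epochs=100, window=5):
    """Toy early-stopping loop.

    A moving average over the last `window` epochs serves as a crude
    estimate of the expected generalisation error. The search stops once
    the gap between the previous and current estimates falls below a
    threshold recomputed each epoch from the observed noise level
    (a hypothetical threshold rule, chosen so it costs nothing extra)."""
    losses = []
    prev_estimate = None
    for epoch in range(max_epochs):
        losses.append(search_step(epoch))
        if len(losses) < window:
            continue
        recent = losses[-window:]
        estimate = sum(recent) / window
        # Hypothetical data-driven threshold: the standard error of the
        # windowed mean, so the tolerance shrinks as the loss stabilises.
        var = sum((x - estimate) ** 2 for x in recent) / (window - 1)
        threshold = (var / window) ** 0.5
        if prev_estimate is not None and abs(prev_estimate - estimate) < threshold:
            print(f"stopping at epoch {epoch}: gap below threshold")
            return epoch
        prev_estimate = estimate
    return max_epochs


if __name__ == "__main__":
    random.seed(0)
    run_search()
```

In this sketch the threshold adapts automatically each epoch from quantities already computed during the search, which mirrors the abstract's claim that the stopping threshold is determined at no additional cost.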