Improving One-Shot NAS with Shrinking-and-Expanding Supernet

Authors:

Highlights:

• A novel search space shrinking method that designs a powerful evaluation criterion for child models in the weight-sharing setting.

• A new method for decoupling supernet parameters by appropriately expanding the supernet's parameters to reduce the degree of weight sharing.

• More stable and accurate performance ranking of candidate architectures based on our proposed supernet.
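To make the "weight sharing" setting referenced above concrete, here is a minimal, hypothetical sketch (not the paper's code) of a one-shot supernet: every layer offers several candidate operations whose parameters are stored once in the supernet, and each sampled child architecture reuses those shared parameters. The layer names and scalar "weights" are illustrative assumptions only.

```python
# Hypothetical illustration of one-shot weight sharing (not the paper's code).
import random

# Each layer offers candidate operations; their "weights" here are scalar
# multipliers stored once in the supernet and shared by every child that
# selects that operation.
SUPERNET = [
    {"conv3x3": 1.5, "conv5x5": 2.0, "identity": 1.0},  # layer 0
    {"conv3x3": 0.5, "conv5x5": 3.0, "identity": 1.0},  # layer 1
]

def sample_child(rng):
    """Pick one candidate op per layer to form a child architecture."""
    return tuple(rng.choice(sorted(layer)) for layer in SUPERNET)

def forward(child, x):
    """Run input x through the child, reusing the shared supernet weights."""
    for layer, op in zip(SUPERNET, child):
        x = x * layer[op]
    return x

rng = random.Random(0)
child = sample_child(rng)
print(child, forward(child, 1.0))
```

Because all children read the same shared parameters, evaluating a child is cheap, but interference between children can distort the ranking; reducing the degree of sharing (as the second highlight proposes) is one way to mitigate this.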

Abstract

• A novel search space shrinking method that designs a powerful evaluation criterion for child models in the weight-sharing setting.

• A new method for decoupling supernet parameters by appropriately expanding the supernet's parameters to reduce the degree of weight sharing.

• More stable and accurate performance ranking of candidate architectures based on our proposed supernet.

Keywords: Neural architecture search, Supernet, Search space shrinking

Article history: Received 12 July 2020; Revised 29 April 2021; Accepted 1 May 2021; Available online 15 May 2021; Version of Record 5 June 2021.

DOI: https://doi.org/10.1016/j.patcog.2021.108025