Exploring more diverse network architectures for single image super-resolution

Authors:

Highlights:

Abstract

We propose a plug-and-play neural architecture search (NAS) method to explore diverse architectures for single image super-resolution (SISR). Unlike current NAS-based methods restricted to single-path or pipeline settings, our method achieves a trade-off between architectural diversity and search cost. We formulate the task in a differentiable manner, inheriting the architecture parameter optimization of Discrete Stochastic Neural Architecture Search (DSNAS). Besides searching over operations, we also search, for each node in a cell, the activation function, the from-node, and the skip-connection node, which diversifies the searched architecture topologies. Searching the skip-connection node individually avoids the phenomenon of excessive skip-connections. Moreover, to alleviate the influence of the inconsistency between the architectures used during training and testing, we introduce random variables into the architecture parameters as a regularization. Benchmark experiments show state-of-the-art performance under given parameter and FLOPs constraints. Compared with other NAS-based SISR methods, our method achieves better performance with less search time and fewer resources. These results further demonstrate the effectiveness of the proposed NAS method.
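To make the two core ideas of the abstract concrete, below is a minimal PyTorch sketch (not the authors' code) of (1) DSNAS-style discrete stochastic sampling of one candidate operation per edge via a straight-through Gumbel-softmax, which keeps the architecture parameters differentiable while the forward pass selects a single operation, and (2) the randomly smoothed regularization, assumed here to take the form of Gaussian noise added to the architecture parameters during search. The class name `SearchableEdge`, the parameter `noise_std`, and the exact noise form are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SearchableEdge(nn.Module):
    """One searchable edge of a cell with a learnable choice among candidate ops."""

    def __init__(self, candidate_ops, noise_std=0.1):
        super().__init__()
        self.ops = nn.ModuleList(candidate_ops)
        # One architecture logit per candidate operation.
        self.alpha = nn.Parameter(torch.zeros(len(candidate_ops)))
        self.noise_std = noise_std  # assumed strength of the random smoothing

    def forward(self, x):
        logits = self.alpha
        if self.training:
            # Randomly smoothed regularization (assumed form): perturb the
            # architecture parameters with Gaussian noise before sampling.
            logits = logits + self.noise_std * torch.randn_like(logits)
        # Straight-through Gumbel-softmax: hard one-hot sample in the forward
        # pass, soft gradients to the logits in the backward pass (DSNAS-style).
        gate = F.gumbel_softmax(logits, tau=1.0, hard=True)
        # Only the sampled op contributes; the rest are multiplied by zero.
        # (An efficient single-path implementation would run only that op.)
        return sum(g * op(x) for g, op in zip(gate, self.ops))

# Usage: search over three candidate operations for one edge.
edge = SearchableEdge([
    nn.Conv2d(16, 16, 3, padding=1),
    nn.Conv2d(16, 16, 5, padding=2),
    nn.Identity(),  # a skip-connection candidate
])
out = edge(torch.randn(2, 16, 32, 32))  # -> shape (2, 16, 32, 32)
```

In the same spirit, the paper's per-node searches over activation functions, from-nodes, and skip-connection nodes could each be expressed as such a categorical choice with its own logits; treating the skip-connection choice as a separate decision is what prevents skip-connections from dominating the operation search.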

Keywords: Neural architecture search, Single image super-resolution, Randomly smoothed regularization

Article history: Received 20 April 2021, Revised 22 August 2021, Accepted 29 September 2021, Available online 26 October 2021, Version of Record 1 November 2021.

DOI: https://doi.org/10.1016/j.knosys.2021.107648