Multi-attention augmented network for single image super-resolution

Authors:

Highlights:

• We present a deep multi-attention augmented network (MAAN) for high-quality image SR, which jointly learns an optimal representation of multi-scale, multi-orientation, and multi-level features. With a multi-stage strategy, MAAN obtains state-of-the-art results.

• We propose a gated U-net structure to generate content-aware features, in which spatial attention is developed to capture long-range dependencies across feature maps of different resolutions. To further enhance feature discrimination, we combine two pre-defined sparse kernels with one standard kernel to extract multi-orientation features, which are fused via a channel attention mechanism (an illustrative sketch follows the highlights).

• We propose a novel self-attention mechanism that helps recover realistic details. This attention refines the final feature maps according to feature interactions at neighbouring positions (see the second sketch after the highlights).
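
As a rough illustration of the multi-orientation branch, the PyTorch sketch below combines one standard 3×3 convolution with two convolutions whose kernels are masked by pre-defined sparse patterns (assumed here to be diagonal and anti-diagonal masks), and fuses the three branches with a squeeze-and-excitation style channel attention. The sparsity patterns, the reduction ratio, and the residual projection are assumptions for illustration, not the authors' released implementation.

```python
# Minimal sketch of multi-orientation feature extraction with channel-attention
# fusion; the sparse kernel patterns and SE-style attention layout are assumed.
import torch
import torch.nn as nn


class MultiOrientationBlock(nn.Module):
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # One standard 3x3 kernel plus two sparsity-masked 3x3 kernels.
        self.conv_full = nn.Conv2d(channels, channels, 3, padding=1)
        self.conv_diag = nn.Conv2d(channels, channels, 3, padding=1)
        self.conv_anti = nn.Conv2d(channels, channels, 3, padding=1)
        # Fixed binary masks defining the two orientations (assumed patterns).
        diag = torch.eye(3)                        # main-diagonal orientation
        anti = torch.flip(torch.eye(3), dims=[1])  # anti-diagonal orientation
        self.register_buffer("mask_diag", diag.view(1, 1, 3, 3))
        self.register_buffer("mask_anti", anti.view(1, 1, 3, 3))
        # Channel attention (squeeze-and-excitation) to fuse the three branches.
        fused = channels * 3
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(fused, fused // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(fused // reduction, fused, 1),
            nn.Sigmoid(),
        )
        self.project = nn.Conv2d(fused, channels, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Apply the sparsity masks to the kernel weights before convolving.
        w_diag = self.conv_diag.weight * self.mask_diag
        w_anti = self.conv_anti.weight * self.mask_anti
        f_full = self.conv_full(x)
        f_diag = nn.functional.conv2d(x, w_diag, self.conv_diag.bias, padding=1)
        f_anti = nn.functional.conv2d(x, w_anti, self.conv_anti.bias, padding=1)
        feats = torch.cat([f_full, f_diag, f_anti], dim=1)
        # Re-weight the concatenated multi-orientation features channel-wise.
        feats = feats * self.attn(feats)
        return self.project(feats) + x  # residual connection (assumed)
```

Masking the weights rather than the inputs keeps each orientation branch learnable while constraining its receptive pattern, which is one plausible way to realise "pre-defined sparse kernels".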
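
The second sketch illustrates one plausible reading of the neighbourhood self-attention: each position attends over a k×k window of its neighbours, with queries, keys, and values produced by 1×1 convolutions. The window size, the scaling factor, and the residual refinement are assumptions; the paper's exact formulation may differ.

```python
# Minimal sketch of local (neighbourhood) self-attention used to refine the
# final feature maps; window size and projection layout are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class NeighbourhoodAttention(nn.Module):
    def __init__(self, channels: int, window: int = 3):
        super().__init__()
        self.window = window
        self.q = nn.Conv2d(channels, channels, 1)
        self.k = nn.Conv2d(channels, channels, 1)
        self.v = nn.Conv2d(channels, channels, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n, c, h, w = x.shape
        k2 = self.window ** 2
        pad = self.window // 2
        q = self.q(x)                                      # (n, c, h, w)
        # Gather the k x k neighbourhood of keys and values for every position.
        k = F.unfold(self.k(x), self.window, padding=pad)  # (n, c*k2, h*w)
        v = F.unfold(self.v(x), self.window, padding=pad)  # (n, c*k2, h*w)
        k = k.view(n, c, k2, h * w)
        v = v.view(n, c, k2, h * w)
        q = q.view(n, c, 1, h * w)
        # Similarity between each query and its neighbours, softmax-normalised.
        attn = (q * k).sum(dim=1, keepdim=True) / (c ** 0.5)  # (n, 1, k2, h*w)
        attn = attn.softmax(dim=2)
        out = (attn * v).sum(dim=2).view(n, c, h, w)
        # Residual refinement of the input feature map.
        return x + out
```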


Keywords: Super-resolution, Multi-scale U-net, Pre-defined sparse kernels, Attention mechanism

Article history: Received 26 June 2020, Revised 10 May 2021, Accepted 21 September 2021, Available online 23 September 2021, Version of Record 1 October 2021.

DOI: https://doi.org/10.1016/j.patcog.2021.108349