Sparse attention block: Aggregating contextual information for object detection
Authors:
Highlights:
• This paper proposes a sparse attention block (SA block) to capture long-range dependencies in an efficient way.
• The SA block costs less than 2% of the GPU memory and computation of the conventional non-local block.
• The SA block can be easily plugged into various object detection frameworks.
• The SA block boosts detection accuracy by more than 1% on COCO with only a slight increase in computation and memory.
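The abstract does not describe the SA block's internals, but the comparison against the non-local block suggests a sparsified self-attention over spatial positions. Below is a minimal, generic sketch of one common way to sparsify a non-local block (keeping only the top-k query-key similarities per position); the top-k scheme, the random projection matrices, and all parameter names are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax; -inf entries become exactly 0.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def sparse_attention(x, k=4, seed=0):
    """Generic top-k sparse self-attention over a flattened feature map.

    x: (N, C) array of N spatial positions with C channels (assumes k <= N).
    Only the k largest similarities per query are kept, so the softmax and
    value aggregation touch far fewer entries than a dense non-local block.
    """
    rng = np.random.default_rng(seed)
    C = x.shape[1]
    # Stand-ins for the 1x1-conv projections of a non-local block
    # (random matrices here purely for illustration).
    Wq, Wk, Wv = (rng.standard_normal((C, C)) / np.sqrt(C) for _ in range(3))
    q, key, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ key.T / np.sqrt(C)                  # (N, N) similarities
    kth = np.partition(scores, -k, axis=1)[:, [-k]]  # k-th largest per row
    scores = np.where(scores >= kth, scores, -np.inf)
    attn = softmax(scores, axis=1)                   # sparse attention weights
    return x + attn @ v                              # residual, as in a non-local block
```

The memory saving in a real implementation comes from never materializing the masked-out entries; this dense NumPy version only illustrates the top-k selection itself.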
Keywords: Object detection, Self-attention, Convolutional neural network
Article history: Received 1 July 2020, Revised 15 May 2021, Accepted 1 November 2021, Available online 17 November 2021, Version of Record 28 February 2022.
DOI: https://doi.org/10.1016/j.patcog.2021.108418