DSG-Fusion: Infrared and visible image fusion via generative adversarial networks and guided filter

Authors:

Highlights:

• A novel GAN-based image fusion method, DSG-Fusion, is proposed. By integrating the guided filter into an end-to-end deep learning framework, DSG-Fusion not only extracts more background and detail information but also avoids the manual design of complex fusion rules.

• Features of the two source images are extracted by two independent data streams, and their base layers and detail layers participate in the feature extraction and fusion processes respectively. The double-stream architecture of the designed network can extract more representative information from the source images.

• DSG-Fusion is compared with five existing methods on two public datasets. Extensive experimental results illustrate that DSG-Fusion preserves more intensity and texture information from the source images, yielding clearer backgrounds and details.
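The highlights describe decomposing each source image into a base layer and a detail layer via the guided filter before the two streams fuse them. A minimal sketch of that decomposition step is shown below; it is an illustration of the standard self-guided filter (He et al.), not the authors' implementation, and the function names `guided_filter` and `decompose` as well as the `radius`/`eps` values are assumptions for demonstration.

```python
import numpy as np
from scipy.ndimage import uniform_filter  # box (mean) filter

def guided_filter(I, p, radius=8, eps=0.01):
    """Edge-preserving guided filter: guide image I, input image p,
    both float arrays scaled to [0, 1]."""
    size = 2 * radius + 1
    mean_I = uniform_filter(I, size)
    mean_p = uniform_filter(p, size)
    corr_Ip = uniform_filter(I * p, size)
    corr_II = uniform_filter(I * I, size)
    var_I = corr_II - mean_I * mean_I      # local variance of the guide
    cov_Ip = corr_Ip - mean_I * mean_p     # local covariance of guide and input
    a = cov_Ip / (var_I + eps)             # per-pixel linear coefficients
    b = mean_p - a * mean_I
    mean_a = uniform_filter(a, size)       # average coefficients over windows
    mean_b = uniform_filter(b, size)
    return mean_a * I + mean_b

def decompose(img, radius=8, eps=0.01):
    """Split an image into a smooth base layer and a residual detail layer,
    as in guided-filter-based fusion pipelines."""
    base = guided_filter(img, img, radius, eps)  # self-guided smoothing
    detail = img - base                          # high-frequency residual
    return base, detail
```

By construction the two layers sum back to the original image, so each stream of a double-stream network can process base and detail information separately without losing content.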


Keywords: Infrared and visible image fusion, Guided filter, Generative adversarial networks

Article history: Received 15 April 2021; Revised 21 September 2021; Accepted 12 March 2022; Available online 22 March 2022; Version of Record 1 April 2022.

Article URL: https://doi.org/10.1016/j.eswa.2022.116905