Two-scale decomposition and deep learning fusion for visible and infrared images
Abstract
This paper addresses the fusion of visible and infrared images to generate composite images that preserve both the thermal radiation information of the infrared spectrum and the detailed texture of the visible spectrum. The proposed approach combines a traditional method, two-scale decomposition, with deep learning, specifically an autoencoder architecture. Each source image undergoes two-scale decomposition, which separates it into high-frequency detail and low-frequency base components. An algorithm unravelling technique then establishes a principled connection between the deep neural network and the traditional signal processing algorithm. The model consists of two encoders that perform the decomposition and a decoder obtained through the unravelling operation. During testing, a fusion layer merges the decomposed feature maps, and the decoder reconstructs the fused image. Evaluation metrics including entropy, average gradient, spatial frequency and standard deviation are employed to objectively assess fusion quality. The proposed approach shows promise for effectively combining visible and infrared imagery across a range of applications.
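As a rough illustration of the pipeline described above, the sketch below implements a two-scale decomposition with a box mean filter, a simple hand-crafted fusion rule, and the four objective metrics. This is a minimal NumPy sketch under stated assumptions: the paper's fusion layer and decoder are learned, whereas here the base layers are averaged and the detail layers combined by a max-absolute rule purely for illustration; the filter size of 31 is a hypothetical choice, not taken from the paper.

```python
import numpy as np

def mean_filter(img, size=31):
    """Box mean filter via a summed-area table (edge-padded)."""
    pad = size // 2
    p = np.pad(img.astype(float), pad, mode='edge')
    ii = np.zeros((p.shape[0] + 1, p.shape[1] + 1))
    ii[1:, 1:] = p.cumsum(0).cumsum(1)
    h, w = img.shape
    s = (ii[size:size + h, size:size + w] - ii[:h, size:size + w]
         - ii[size:size + h, :w] + ii[:h, :w])
    return s / (size * size)

def two_scale(img, size=31):
    """Split an image into a low-frequency base and high-frequency detail."""
    base = mean_filter(img, size)
    return base, img - base  # base + detail reconstructs the source exactly

def fuse(ir, vis, size=31):
    """Illustrative fusion rule: average bases, keep the stronger detail."""
    b_ir, d_ir = two_scale(ir, size)
    b_vis, d_vis = two_scale(vis, size)
    base = 0.5 * (b_ir + b_vis)
    detail = np.where(np.abs(d_ir) >= np.abs(d_vis), d_ir, d_vis)
    return base + detail

def entropy(img):
    """Shannon entropy of the 8-bit intensity histogram (bits)."""
    hist, _ = np.histogram(img, bins=256, range=(0, 255))
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def average_gradient(img):
    """Mean magnitude of horizontal/vertical intensity differences."""
    g = img.astype(float)
    gx = np.diff(g, axis=1)[:-1, :]
    gy = np.diff(g, axis=0)[:, :-1]
    return np.mean(np.sqrt((gx ** 2 + gy ** 2) / 2))

def spatial_frequency(img):
    """Root of summed squared row and column frequencies."""
    g = img.astype(float)
    rf = np.sqrt(np.mean(np.diff(g, axis=1) ** 2))
    cf = np.sqrt(np.mean(np.diff(g, axis=0) ** 2))
    return np.sqrt(rf ** 2 + cf ** 2)
```

Standard deviation, the fourth metric, is simply `np.std` of the fused image. In the paper's actual method, the two encoders replace `two_scale` and a learned fusion layer replaces the hand-crafted rule in `fuse`.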
Keywords
Near-infrared image; Visible image; Two-scale decomposition; Algorithm unravelling
DOI: http://doi.org/10.11591/ijece.v15i2.pp1593-1601
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
International Journal of Electrical and Computer Engineering (IJECE)
p-ISSN 2088-8708, e-ISSN 2722-2578
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).