Paper
27 June 2023 DRA-Net: densely residual attention based low-light image enhancement
Sami Ul Rehman, Ammar Hawbani, Xingfu Wang, Muhammad Hamza, Liang Zhao, Saeed H. Alsamhi, Majjed Al-Qatf
Author Affiliations +
Proceedings Volume 12705, Fourteenth International Conference on Graphics and Image Processing (ICGIP 2022); 1270522 (2023) https://doi.org/10.1117/12.2680479
Event: Fourteenth International Conference on Graphics and Image Processing (ICGIP 2022), 2022, Nanjing, China
Abstract
The visual quality of nighttime photographs diminishes greatly due to low contrast and high noise. A robust image enhancement methodology is needed to bring such low-light images close to standard daylight images. Due to the deteriorated conditions of uneven light and noise, this enhancement problem is ill-posed. This paper proposes the Densely Residual Attention Network (DRANet), an end-to-end attention-based densely residual network. The architecture of DRANet consists of two sub-modules: a convolution block (CB) and a densely residual feature - convolutional block attention module (DRF-CBAM). DRF-CBAM in turn comprises a deep residual feature block (DRFB) and a convolutional block attention module (CBAM). Building on recent results from attention- and deep-residual-based convolutional networks in a number of computer vision problems, we use the DRFB to enhance features in depth through its dense and residual skip connections, while the lightweight CBAM attention module extracts features along both the spatial and channel axes. The contrast, luminosity, and noise of the enhanced images are balanced by a color balancing function applied at the end of the proposed network. Furthermore, we use a combination of the LLab, LSSIM, and LMAE loss functions to keep the proposed network stable during training and to recover both contextual and local details. MAE, PSNR, SSIM, MS-SSIM, FSIM, cosine similarity, and deltaE2000 are used as full-reference image quality assessment (IQA) metrics, and NIQE as a no-reference metric. Experimental results show that the proposed methodology is highly effective, achieving higher full-reference and lower no-reference IQA scores. The effectiveness of the method is further demonstrated by the visual and perceptual quality of the enhanced images.
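The combined training loss described in the abstract can be sketched as a weighted sum of per-term losses. Everything below is an illustrative assumption, not the paper's implementation: the SSIM term is a simplified global version (no sliding window), the weights are placeholders, and the Lab-space term is omitted since the paper does not specify its form here.

```python
import numpy as np

def mae_loss(pred, target):
    # Mean absolute error over all pixels (the paper's L_MAE term).
    return float(np.mean(np.abs(pred - target)))

def ssim_loss(pred, target, c1=0.01**2, c2=0.03**2):
    # Simplified *global* SSIM (single statistics over the whole image);
    # the paper's L_SSIM presumably uses the standard windowed formulation.
    mu_p, mu_t = pred.mean(), target.mean()
    var_p, var_t = pred.var(), target.var()
    cov = ((pred - mu_p) * (target - mu_t)).mean()
    ssim = ((2 * mu_p * mu_t + c1) * (2 * cov + c2)) / \
           ((mu_p**2 + mu_t**2 + c1) * (var_p + var_t + c2))
    return float(1.0 - ssim)  # loss is 0 for identical images

def combined_loss(pred, target, w_ssim=0.5, w_mae=0.5):
    # Weighted combination of the terms. The weights (and the omitted
    # Lab-space term) are hypothetical choices for illustration only.
    return w_ssim * ssim_loss(pred, target) + w_mae * mae_loss(pred, target)
```

Combining a structural term (SSIM) with a pixel-wise term (MAE) is a common way to recover both contextual and local details, which matches the motivation stated above.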
© (2023) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Sami Ul Rehman, Ammar Hawbani, Xingfu Wang, Muhammad Hamza, Liang Zhao, Saeed H. Alsamhi, and Majjed Al-Qatf "DRA-Net: densely residual attention based low-light image enhancement", Proc. SPIE 12705, Fourteenth International Conference on Graphics and Image Processing (ICGIP 2022), 1270522 (27 June 2023); https://doi.org/10.1117/12.2680479
KEYWORDS
Image enhancement
Image quality
Convolution
Color
Light sources and illumination
Education and training
RGB color model
