Multiscale Underwater Image Enhancement in RGB and HSV Color Spaces


Bibliographic Details
Published in: IEEE Transactions on Instrumentation and Measurement, Vol. 72, pp. 1-14
Main Authors: Liu, Chufan, Shu, Xin, Pan, Lei, Shi, Jinlong, Han, Bin
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2023
ISSN: 0018-9456 (print), 1557-9662 (online)
Description
Summary: Clear underwater images serve the exploration and measurement of ocean resources. However, underwater images suffer from color deviations and haze caused by wavelength-dependent light attenuation and scattering. To address this problem, we propose a multiscale dual-color-space underwater image enhancement network (MSDC-Net) comprising a color correction block and a deep learning-based network. Concretely, the color correction block compensates for the most absorbed color channel and limits the least absorbed one. First, the color histogram distributions of all channels are shifted into a similar range, which smooths and generalizes color deviations to some extent. Second, the corrected multiscale images are fed into a deep learning-based asymmetric multiscale encoder-decoder architecture that works in both the RGB and HSV color spaces to extract rich and varied features. The extracted features are then integrated through a selective kernel concatenation (SKC) module. Finally, the decoder produces competitive outputs from the integrated features. Extensive experiments on real-world and synthetic underwater images demonstrate that the proposed MSDC-Net achieves outstanding results in both subjective visual comparisons and objective metrics.
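The color correction step outlined in the summary (compensating the most absorbed channel, then shifting all channel histograms into a similar range) can be sketched roughly as follows. The compensation weighting and the min-max stretching used here are illustrative assumptions, not the paper's exact formulation; `color_compensate` is a hypothetical name.

```python
import numpy as np

def color_compensate(img):
    """Rough sketch of the color correction idea: boost the most
    absorbed (weakest-mean) channel using the least absorbed one,
    then shift every channel's histogram into a common range.
    The exact weighting in the paper may differ."""
    img = img.astype(np.float64)
    means = img.reshape(-1, 3).mean(axis=0)   # per-channel means
    weakest = int(np.argmin(means))           # most absorbed channel
    strongest = int(np.argmax(means))         # least absorbed channel
    # Hypothetical compensation: lift the weakest channel in proportion
    # to the gap between channel means, guided by the strongest channel.
    alpha = (means[strongest] - means[weakest]) / 255.0
    img[..., weakest] += alpha * (255.0 - img[..., weakest]) * (img[..., strongest] / 255.0)
    # Shift each channel's histogram to a similar range via min-max stretching.
    for c in range(3):
        ch = img[..., c]
        lo, hi = ch.min(), ch.max()
        if hi > lo:
            img[..., c] = (ch - lo) / (hi - lo) * 255.0
    return np.clip(img, 0.0, 255.0).astype(np.uint8)
```

After this step, every channel spans the same intensity range, which is one plausible reading of "shifted to a similar range" before the multiscale encoder-decoder consumes the corrected images.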
DOI: 10.1109/TIM.2023.3298395