RGPNet: Learning a detail-preserved progressive model for single image deraining based on gradient prior

Bibliographic Details
Published in: Digital Signal Processing, Vol. 165, p. 105312
Main Authors: Luo, Yu, Liang, Gaoquan, Ling, Jie, Zhou, Teng, Huang, Huiwu, Han, Tian
Format: Journal Article
Language: English
Published: Elsevier Inc., 01.10.2025
ISSN: 1051-2004
Description
Summary: Rain removal from images is challenging due to diverse rain distributions and complex backgrounds. Although previous single-stage approaches have greatly improved deraining performance, they commonly lose background detail. To effectively remove diversely distributed rain, many methods adopt a multistage framework for progressive deraining. Most of these multistage methods repeatedly reuse information from previous stages, or the original rainy image, to refine the rain extracted from the background, which places high demands on the ability to discriminate background structure from rain. To address this problem, we introduce a gradient prior that explicitly preserves background details. In addition, an attention LSTM is proposed to reduce the artifacts caused by over-deraining by focusing on the more visible rain regions. The overall architecture of the proposed method consists of a rain streak extraction branch and a background detail recovery branch, with the designed attention LSTM and the proposed gradient prior integrated into the former and latter branches, respectively. Experiments on several well-known benchmark datasets show that our method outperforms many state-of-the-art methods.
DOI: 10.1016/j.dsp.2025.105312
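
The abstract above only summarizes the method, so the sketch below is not the authors' code. It shows one plausible way a gradient prior could supervise a background detail recovery branch: fixed Sobel filters approximate image gradients, and an L1 penalty between the gradients of the derained output and the clean target encourages edges and texture to be preserved. The names sobel_gradients and GradientPriorLoss, and the 0.1 loss weight, are hypothetical assumptions rather than details taken from the paper.

# Minimal sketch (not the authors' implementation) of a gradient-prior loss term.
import torch
import torch.nn as nn
import torch.nn.functional as F

def sobel_gradients(img: torch.Tensor) -> torch.Tensor:
    """Approximate horizontal and vertical image gradients with fixed Sobel kernels.

    img: (N, C, H, W) tensor; returns (N, 2*C, H, W) with gx and gy stacked per channel.
    """
    kx = torch.tensor([[-1., 0., 1.], [-2., 0., 2.], [-1., 0., 1.]],
                      device=img.device, dtype=img.dtype)
    ky = kx.t()
    c = img.shape[1]
    # Depthwise convolution: each input channel is filtered by kx and ky separately.
    kernel = torch.stack([kx, ky]).unsqueeze(1).repeat(c, 1, 1, 1)  # (2*C, 1, 3, 3)
    return F.conv2d(img, kernel, padding=1, groups=c)

class GradientPriorLoss(nn.Module):
    """Penalizes gradient differences between the derained output and the clean target,
    a hedged guess at how gradient-based supervision could preserve background detail."""
    def forward(self, derained: torch.Tensor, clean: torch.Tensor) -> torch.Tensor:
        return F.l1_loss(sobel_gradients(derained), sobel_gradients(clean))

if __name__ == "__main__":
    # Random tensors stand in for a derained/clean image pair.
    derained = torch.rand(1, 3, 64, 64, requires_grad=True)
    clean = torch.rand(1, 3, 64, 64)
    loss = F.l1_loss(derained, clean) + 0.1 * GradientPriorLoss()(derained, clean)
    loss.backward()
    print(f"total loss: {loss.item():.4f}")

In this sketch the gradient term is simply added to a pixel-wise L1 loss; how (or whether) the paper weights and combines its loss terms across the two branches is not specified in the abstract.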