DPF-Net: Physical imaging model embedded data-driven underwater image enhancement
| Title: | DPF-Net: Physical imaging model embedded data-driven underwater image enhancement |
|---|---|
| Authors: | Han Mei, Kunqian Li, Shuaixin Liu, Chengzhi Ma, Qianli Jiang |
| Source: | ISPRS Journal of Photogrammetry and Remote Sensing. 228:679-693 |
| Publication Status: | Preprint |
| Publisher Information: | Elsevier BV, 2025. |
| Publication Year: | 2025 |
| Subject Terms: | FOS: Computer and information sciences, Computer Vision and Pattern Recognition (cs.CV), Image and Video Processing (eess.IV), Computer Science - Computer Vision and Pattern Recognition, FOS: Electrical engineering, electronic engineering, information engineering, Electrical Engineering and Systems Science - Image and Video Processing |
| Description: | Due to the complex interplay of light absorption and scattering in the underwater environment, underwater images experience significant degradation. This research presents a two-stage underwater image enhancement network called the Data-Driven and Physical Parameters Fusion Network (DPF-Net), which harnesses the robustness of physical imaging models alongside the generality and efficiency of data-driven methods. We first train a physical parameter estimate module using synthetic datasets to guarantee the trustworthiness of the physical parameters, rather than solely learning the fitting relationship between raw and reference images by the application of the imaging equation, as is common in prior studies. This module is subsequently trained in conjunction with an enhancement network, where the estimated physical parameters are integrated into a data-driven model within the embedding space. To maintain the uniformity of the restoration process amid underwater imaging degradation, we propose a physics-based degradation consistency loss. Additionally, we suggest an innovative weak reference loss term utilizing the entire dataset, which alleviates our model's reliance on the quality of individual reference images. Our proposed DPF-Net demonstrates superior performance compared to other benchmark methods across multiple test sets, achieving state-of-the-art results. The source code and pre-trained models are available on the project home page: https://github.com/OUCVisionGroup/DPF-Net. |
| Document Type: | Article |
| Language: | English |
| ISSN: | 0924-2716 |
| DOI: | 10.1016/j.isprsjprs.2025.07.031 |
| arXiv DOI: | 10.48550/arxiv.2503.12470 |
| Access URL: | http://arxiv.org/abs/2503.12470 |
| Rights: | Elsevier TDM arXiv Non-Exclusive Distribution |
| Accession Number: | edsair.doi.dedup.....c755e47bda57cf1f950d9a8d3c8992d4 |
| Database: | OpenAIRE |