AIFNet: All-in-Focus Image Restoration Network Using a Light Field-Based Dataset


Detailed bibliography
Published in: IEEE Transactions on Computational Imaging, Volume 7, pp. 675-688
Main authors: Ruan, Lingyan; Chen, Bin; Li, Jizhou; Lam, Miu-Ling
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2021
ISSN: 2573-0436, 2333-9403
Description
Summary: Defocus blur often degrades the performance of image understanding tasks such as object recognition and image segmentation. Restoring an all-in-focus image from its defocused version is highly beneficial to visual information processing and many photographic applications, despite being a severely ill-posed problem. We propose AIFNet, a novel convolutional neural network architecture for removing spatially varying defocus blur from a single defocused image. We leverage light field synthetic aperture and refocusing techniques to generate a large set of realistic defocused and all-in-focus image pairs depicting a variety of natural scenes for network training. AIFNet consists of three modules: defocus map estimation, deblurring, and domain adaptation. The effects and performance of the various network components are extensively evaluated, and we compare our method with existing solutions on several publicly available datasets. Quantitative and qualitative evaluations demonstrate that AIFNet achieves state-of-the-art performance.
DOI: 10.1109/TCI.2021.3092891
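The abstract above outlines a three-stage pipeline: estimate a spatially varying defocus map, use it to deblur the image, and apply domain adaptation to bridge the gap between light-field-rendered training pairs and real photographs. The fragment below is a minimal PyTorch-style sketch of how the first two stages could be wired together; the module names, layer choices, and channel counts are illustrative assumptions, not the published AIFNet design, and the domain adaptation module (a training-time component) is omitted.

```python
# Illustrative sketch only: defocus map estimation followed by deblurring,
# as described in the abstract. All design choices here are assumptions,
# not the authors' published AIFNet architecture.
import torch
import torch.nn as nn

class DefocusMapEstimator(nn.Module):
    """Predicts a per-pixel defocus (blur amount) map from an RGB image."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 1, 3, padding=1), nn.Sigmoid(),  # defocus map in [0, 1]
        )

    def forward(self, x):
        return self.net(x)

class Deblurring(nn.Module):
    """Restores an all-in-focus image, conditioned on the estimated defocus map."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3 + 1, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 3, 3, padding=1),
        )

    def forward(self, x, defocus_map):
        # Residual restoration: predict a correction on top of the blurry input.
        residual = self.net(torch.cat([x, defocus_map], dim=1))
        return x + residual

class AIFNetSketch(nn.Module):
    """End-to-end wrapper: defocused image in, all-in-focus estimate out."""
    def __init__(self):
        super().__init__()
        self.defocus = DefocusMapEstimator()
        self.deblur = Deblurring()

    def forward(self, x):
        d = self.defocus(x)
        return self.deblur(x, d), d

if __name__ == "__main__":
    model = AIFNetSketch()
    blurry = torch.randn(1, 3, 256, 256)  # dummy defocused input
    sharp, defocus_map = model(blurry)
    print(sharp.shape, defocus_map.shape)  # (1, 3, 256, 256) and (1, 1, 256, 256)
```

In this sketch the deblurring stage consumes the defocus map as an extra input channel, reflecting the idea that the restoration should adapt to the locally varying blur amount; how the actual paper conditions deblurring on the defocus map may differ.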