Blind atmospheric turbulence deconvolution

Bibliographic Details
Published in: IET Image Processing, Vol. 14, No. 14, pp. 3422–3432
Main Authors: Deledalle, Charles-Alban, Gilles, Jérôme
Format: Journal Article
Language: English
Published: The Institution of Engineering and Technology, 01.12.2020
ISSN: 1751-9659, 1751-9667
Description
Summary: A new blind image deconvolution technique is developed for atmospheric turbulence deblurring, to overcome the limitations of ‘generic’ blind deconvolution algorithms that do not take into account the complicated physics of turbulence. The originality of the proposed approach lies in the use of an actual physical model, known as the Fried kernel, which quantifies the impact of atmospheric turbulence on the optical resolution of images. While the original expression of the Fried kernel can seem cumbersome at first sight, the authors show that it can be reparameterised in a much simpler form. This simple expression makes it possible to embed the kernel efficiently in the proposed blind atmospheric turbulence deconvolution (BATUD) algorithm. BATUD is an iterative algorithm that alternately performs deconvolution and estimates the Fried kernel, jointly relying on a Gaussian mixture model prior on natural image patches and controlling the squared Euclidean norm of the Fried kernel. Numerical experiments show that the proposed blind deconvolution algorithm behaves well in different simulated turbulence scenarios, as well as on real images. Not only does BATUD outperform state-of-the-art approaches to atmospheric turbulence deconvolution in terms of image-quality metrics, but it is also faster.
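The alternating structure described in the summary can be sketched in a few lines of NumPy. This is only an illustrative skeleton, not the paper's estimator: the single-parameter kernel exp(-alpha·‖ω‖^(5/3)) stands in for the reparameterised Fried kernel (with alpha lumping the physical constants), a Wiener filter stands in for the GMM-patch-prior deconvolution, and a least-squares grid search stands in for the paper's norm-controlled kernel estimation. All function names here are hypothetical.

```python
import numpy as np

def fried_otf(shape, alpha):
    """Single-parameter frequency-domain turbulence kernel exp(-alpha * ||w||^(5/3)).

    Assumption: this one-scalar form mirrors the reparameterised Fried kernel
    mentioned in the abstract; 'alpha' lumps wavelength, aperture diameter and
    Fried parameter r0 into a single value.
    """
    fy = np.fft.fftfreq(shape[0])
    fx = np.fft.fftfreq(shape[1])
    FX, FY = np.meshgrid(fx, fy)
    rho = np.sqrt(FX ** 2 + FY ** 2)
    return np.exp(-alpha * rho ** (5.0 / 3.0))

def wiener_deconv(blurred, otf, nsr=1e-2):
    """Non-blind deconvolution step (a crude stand-in for the paper's
    GMM-prior deconvolution); 'nsr' is a noise-to-signal regulariser."""
    B = np.fft.fft2(blurred)
    X = np.conj(otf) * B / (np.abs(otf) ** 2 + nsr)
    return np.real(np.fft.ifft2(X))

def estimate_alpha(image, blurred, alphas):
    """Kernel-estimation step: pick the alpha whose reblur of the current
    image estimate best matches the observation, in the least-squares sense."""
    def residual(a):
        otf = fried_otf(blurred.shape, a)
        reblur = np.real(np.fft.ifft2(otf * np.fft.fft2(image)))
        return np.sum((reblur - blurred) ** 2)
    return min(alphas, key=residual)

def batud_sketch(blurred, alphas, n_iter=3, nsr=1e-2):
    """Alternate deconvolution and kernel estimation, as the abstract
    describes for BATUD."""
    alpha = alphas[len(alphas) // 2]  # arbitrary initial guess from the grid
    image = blurred
    for _ in range(n_iter):
        image = wiener_deconv(blurred, fried_otf(blurred.shape, alpha), nsr)
        alpha = estimate_alpha(image, blurred, alphas)
    return image, alpha
```

Note that without a strong image prior this alternation is weakly constrained (the reblur residual tends to favour the alpha already used for deconvolution), which is precisely why the paper couples the kernel estimate with a GMM prior on image patches.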
DOI:10.1049/iet-ipr.2019.1442