A Novel Defocus-Blur Region Detection Approach Based on DCT Feature and PCNN Structure


Bibliographic Details
Published in: IEEE Access, Vol. 11, p. 1
Main Authors: Basar, Sadia; Ali, Mushtaq; Waheed, Abdul; Ahmad, Muneer; Miraz, Mahdi H.
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.01.2023
ISSN: 2169-3536
Description
Summary: Motion and out-of-focus effects are the main causes of blurred regions in defocused images. Blur can adversely affect various image features at the texture, pixel, and region levels, so it is important to detect in-focus objects in defocused-blurred images after segmenting the blurred and non-blurred regions. State-of-the-art techniques are prone to noisy pixels, and the local descriptors they use to build segmentation metrics are complex. To address these issues, this research proposes a novel hybrid focused-region detection approach based on Discrete Cosine Transform (DCT) coefficients and a Pulse-Coupled Neural Network (PCNN) structure. The proposed approach partially resolves the limitations of existing contrast schemes in detecting focused smooth objects within the out-of-focus smooth regions of the defocus dataset. Visual and quantitative evaluation shows that the proposed approach outperforms the referenced algorithms in both accuracy and efficiency. Its highest Fα-score is 0.7940 on Zhao's dataset and 0.9178 on Shi's dataset.
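To make the DCT feature concrete, the sketch below illustrates one common DCT-based blur cue: the share of each 8×8 block's energy held in its high-frequency coefficients, which drops in defocused regions. This is a minimal, hypothetical illustration, not the authors' implementation; the function names, the (u + v) frequency split, and the fixed threshold standing in for the paper's PCNN segmentation stage are all assumptions made for this example.

```python
import numpy as np

def dct_matrix(n: int) -> np.ndarray:
    """Orthonormal DCT-II basis matrix of size n x n."""
    k = np.arange(n).reshape(-1, 1)   # frequency index
    m = np.arange(n).reshape(1, -1)   # sample index
    c = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * m + 1) * k / (2 * n))
    c[0, :] = np.sqrt(1.0 / n)        # DC row uses its own scaling
    return c

def block_sharpness(gray: np.ndarray, block: int = 8) -> np.ndarray:
    """Per-block sharpness map: fraction of each tile's AC energy
    carried by its high-frequency DCT coefficients (u + v >= block)."""
    h, w = gray.shape
    h, w = h - h % block, w - w % block          # crop to whole blocks
    g = gray[:h, :w].astype(np.float64)
    c = dct_matrix(block)
    u, v = np.meshgrid(np.arange(block), np.arange(block), indexing="ij")
    hf = (u + v) >= block                         # high-frequency mask
    out = np.zeros((h // block, w // block))
    for i in range(0, h, block):
        for j in range(0, w, block):
            d = c @ g[i:i + block, j:j + block] @ c.T   # 2D DCT-II of the tile
            ac = d.copy()
            ac[0, 0] = 0.0                              # drop the DC term
            total = np.sum(ac ** 2) + 1e-12
            out[i // block, j // block] = np.sum(d[hf] ** 2) / total
    return out

# Usage: blocks with little high-frequency energy are likely defocused.
# The fixed threshold below is a hypothetical stand-in for the paper's
# PCNN segmentation stage, not a value taken from the paper.
rng = np.random.default_rng(0)
img = rng.random((64, 64))            # stand-in for a grayscale image
sharp = block_sharpness(img)
blur_mask = sharp < 0.2               # True where a block looks blurred
```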
DOI: 10.1109/ACCESS.2023.3309820