Self-Adaptive Image Thresholding within Nonextensive Entropy and the Variance of the Gray-Level Distribution


Published in: Entropy (Basel, Switzerland), Volume 24, Issue 3, p. 319
Main Authors: Deng, Qingyu; Shi, Zeyi; Ou, Congjie
Format: Journal Article
Language: English
Published: Switzerland: MDPI AG, 23 February 2022
ISSN: 1099-4300
Description
Summary: In order to automatically recognize different kinds of objects against their backgrounds, a self-adaptive segmentation algorithm that can effectively extract targets from various surroundings is of great importance. Image thresholding is widely adopted in this field because of its simplicity and high efficiency. Entropy-based and variance-based algorithms are the two main families of image thresholding methods, and they have been developed independently for different kinds of images over the years. In this paper, their advantages are combined and a new algorithm is proposed to handle a more general scope of images, including images with long-range correlations among pixels, which can be characterized by a nonextensive parameter. In comparison with other well-known entropy-based and variance-based image thresholding algorithms, the new algorithm performs better in terms of correctness and robustness, as quantitatively demonstrated by four quality indices: ME, RAE, MHD, and PSNR. Furthermore, the whole process of the new algorithm has potential application in self-adaptive object recognition.
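For orientation, the entropy-based family the abstract refers to builds on nonextensive (Tsallis) entropy thresholding, where a parameter q models long-range correlations among pixels. The sketch below shows that classical criterion only; it is a minimal illustration under stated assumptions (the function name, the choice q = 0.8, and the test image are illustrative), not the paper's combined entropy/variance algorithm.

```python
import numpy as np

def tsallis_threshold(image, q=0.8):
    """Pick a gray-level threshold by maximizing the combined
    Tsallis entropy of the foreground and background classes.

    The nonextensive parameter q models long-range pixel
    correlations; q -> 1 recovers the Shannon (Kapur) criterion.
    """
    # Normalized 256-bin gray-level histogram
    hist = np.bincount(image.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()

    best_t, best_score = 0, -np.inf
    for t in range(1, 255):
        pa, pb = p[:t].sum(), p[t:].sum()
        if pa <= 0 or pb <= 0:
            continue  # one class is empty; threshold is invalid
        # Tsallis entropy S_q = (1 - sum(p_i^q)) / (q - 1),
        # computed on the within-class normalized distributions
        sa = (1.0 - np.sum((p[:t] / pa) ** q)) / (q - 1.0)
        sb = (1.0 - np.sum((p[t:] / pb) ** q)) / (q - 1.0)
        # Pseudo-additivity of Tsallis entropy:
        # S_q(A+B) = S_q(A) + S_q(B) + (1 - q) S_q(A) S_q(B)
        score = sa + sb + (1.0 - q) * sa * sb
        if score > best_score:
            best_t, best_score = t, score
    return best_t
```

On a bimodal image the maximizer lands between the two gray-level modes, which is what makes the criterion usable for self-adaptive foreground/background separation.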
DOI: 10.3390/e24030319