Study on fast fractal image compression algorithm based on centroid radius

Bibliographic Details
Published in: Systems Science & Control Engineering, Vol. 12, No. 1
Main Authors: Li, Li-Feng; Hua, Yang; Liu, Yan-Hong; Huang, Feng-Hua
Format: Journal Article
Language: English
Published: Macclesfield: Taylor & Francis, 31.12.2024
ISSN: 2164-2583
Online Access: Full text
Description
Abstract: To achieve real-time transmission of image information under limited network bandwidth, we propose a fast fractal image compression algorithm based on the centroid radius. The algorithm addresses the main shortcomings of conventional fractal encoding, namely high computational complexity and long encoding times. We calculate the centroid radius of both domain and range blocks and sort the domain blocks by this value; for each range block, the best-matched domain block can then be found within a small neighbourhood of the sorted codebook. Additionally, we apply a bilinear interpolation algorithm to reconstruct the image's edges, reducing the blocking effect. In this paper, scalar features are used to characterize image blocks, and the codebook organization and matching method are optimized accordingly; localizing the matching search range in this way shortens the coding time. Experimental results demonstrate that, with the proposed scheme, encoding is 4.68 times faster than conventional fractal encoding, while the decoded image still achieves good fidelity and compression ratio.
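
The abstract describes two speed-up ideas: characterizing every image block by a single scalar, its centroid radius, and sorting the domain-block codebook by that scalar so that each range block is matched against only a small neighbourhood of candidates. The following Python sketch illustrates this codebook organization and nearest-neighbourhood search; the exact centroid-radius formula, the window size k, and all function names are assumptions made for illustration, not the paper's definitions.

import numpy as np

def centroid_radius(block: np.ndarray) -> float:
    """Scalar feature of an image block (assumed definition: the
    intensity-weighted mean distance of pixels from the block's
    intensity centroid; the paper's exact formula may differ)."""
    h, w = block.shape
    ys, xs = np.mgrid[0:h, 0:w]
    total = float(block.sum())
    if total == 0.0:
        return 0.0
    cy = (ys * block).sum() / total   # intensity centroid, row coordinate
    cx = (xs * block).sum() / total   # intensity centroid, column coordinate
    dist = np.sqrt((ys - cy) ** 2 + (xs - cx) ** 2)
    return float((dist * block).sum() / total)

def build_sorted_codebook(domain_blocks):
    """Compute the centroid radius of every domain block and sort the
    codebook by it, so matching can be restricted to a local window."""
    feats = np.array([centroid_radius(d) for d in domain_blocks])
    order = np.argsort(feats)
    return feats[order], order

def candidate_domains(range_block, sorted_feats, order, k=8):
    """Return indices of the k domain blocks whose centroid radius is
    closest to that of the range block (nearest-neighbourhood search)."""
    r = centroid_radius(range_block)
    pos = int(np.searchsorted(sorted_feats, r))
    lo, hi = max(0, pos - k), min(len(sorted_feats), pos + k)
    window = np.arange(lo, hi)
    nearest = window[np.argsort(np.abs(sorted_feats[window] - r))[:k]]
    return order[nearest]             # indices into the original domain list

Only the candidates returned by such a search would then be tested with the usual affine (contrast and brightness) fractal match, which is where the reported 4.68x encoding speed-up would come from; the bilinear interpolation of block edges mentioned in the abstract is applied at decoding time and is not sketched here.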
DOI: 10.1080/21642583.2023.2269183