Study on fast fractal image compression algorithm based on centroid radius


Published in: Systems Science & Control Engineering, Volume 12, Issue 1
Main authors: Li, Li-Feng; Hua, Yang; Liu, Yan-Hong; Huang, Feng-Hua
Format: Journal Article
Language: English
Publication details: Macclesfield: Taylor & Francis, 31 December 2024
ISSN: 2164-2583
Description
Summary: To achieve real-time transmission of image information under limited network bandwidth conditions, we propose a fast fractal image compression algorithm based on centroid radius. This algorithm addresses the shortcomings of conventional fractal encoding algorithms, such as high computational complexity and long encoding times. In our proposed algorithm, we calculate the centroid radius for both domain and range blocks, then sort the domain blocks by centroid radius. For each range block, we can then find the best-matched domain blocks within its nearest neighbourhood. Additionally, we apply a bilinear interpolation algorithm to reconstruct the image's edges, reducing the blocking effect. In this paper, we use scalars to characterize image block features and optimize the codebook organization structure and matching method accordingly. This localization of the matching search range results in shorter coding times. Experimental results demonstrate that, with the proposed scheme, our encoder is 4.68 times faster than conventional fractal encoding, while still achieving good fidelity and compression ratios for the decoded image.
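The search-localization idea in the summary can be sketched in a few lines. The paper's exact centroid-radius definition is not given in this record, so the feature below is an assumption: pixel intensities are treated as mass, and the feature is the intensity-weighted mean distance of pixels from the block's intensity centroid. The sorted-codebook lookup and the `window` parameter are likewise illustrative, not the authors' implementation.

```python
import bisect
import numpy as np

def centroid_radius(block):
    """Assumed centroid-radius feature: intensity-weighted mean
    distance of pixels from the block's intensity centroid."""
    h, w = block.shape
    total = block.sum()
    if total == 0:
        return 0.0
    ys, xs = np.mgrid[0:h, 0:w]
    cy = (ys * block).sum() / total          # intensity centroid row
    cx = (xs * block).sum() / total          # intensity centroid column
    dist = np.hypot(ys - cy, xs - cx)        # distance of each pixel
    return float((dist * block).sum() / total)

def build_codebook(domain_blocks):
    """Sort domain blocks by centroid radius so each range block is
    matched only against its nearest neighbours in feature space."""
    keyed = sorted((centroid_radius(b), i) for i, b in enumerate(domain_blocks))
    return [k for k, _ in keyed], [i for _, i in keyed]

def nearest_candidates(keys, order, range_block, window=4):
    """Return indices of domain blocks whose centroid radius is
    closest to the range block's, localizing the matching search."""
    r = centroid_radius(range_block)
    pos = bisect.bisect_left(keys, r)
    lo = max(0, pos - window)
    hi = min(len(order), pos + window)
    return order[lo:hi]
```

Because only the handful of candidates returned by `nearest_candidates` undergo the expensive affine-matching step, the per-range-block cost drops from a full codebook scan to a binary search plus a small fixed window, which is the kind of speed-up the summary reports.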
DOI:10.1080/21642583.2023.2269183