Ant weight-lifting algorithm for motion estimation

Bibliographic Details
Published in: Iran Journal of Computer Science (Online), Vol. 6, No. 3, pp. 207–219
Main Authors: Acharjee, Suvojit; Chaudhuri, Sheli Sinha
Format: Journal Article
Language: English
Published: Cham: Springer International Publishing (Springer Nature B.V.), 01.09.2023
ISSN: 2520-8438, 2520-8446
Description
Summary: Every video coding standard includes and requires motion estimation and compensation. The full search algorithm, which provides the best motion estimation, has a very high computation cost. Researchers have developed several algorithms to reduce the cost of computation. However, most of these algorithms become trapped in local minima during the search. Population-based evolutionary algorithms are widely used to develop a computationally efficient and cost-effective motion estimation strategy. The most recent effort used the Jaya algorithm to develop a motion estimation process that outperformed the state-of-the-art test zone search algorithm. In this study, a motion estimation algorithm based on the ant weight-lifting approach is proposed. Previously, the ant weight-lifting algorithm was used to solve a variety of problems, such as image segmentation and signal compression. The ant weight-lifting algorithm's computation cost was reduced by adopting a fitness estimation method that uses nearest-neighbor interpolation and an early termination strategy. Compared to Jaya algorithm-based motion estimation, the proposed algorithm executes up to 3% more quickly and exhibits up to 1.2 dB less distortion.
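For context, the sketch below is not the authors' ant weight-lifting method; it only illustrates the block-matching setting the abstract refers to: a full-search baseline with a SAD (sum of absolute differences) fitness and a simple early-termination check that abandons a candidate once it cannot beat the current best. Block size, search range, and the synthetic test frames are illustrative assumptions; a population-based method such as the one proposed would replace the exhaustive loop with guided candidate selection.

# Minimal block-matching motion estimation sketch (illustrative assumptions:
# 16x16 blocks, +/-8 search range, synthetic frames). Not the paper's algorithm.
import numpy as np

def sad(block, candidate, best_so_far=np.inf):
    """Sum of absolute differences, abandoning early once it exceeds the best cost."""
    total = 0
    for row_a, row_b in zip(block, candidate):
        total += np.abs(row_a.astype(np.int32) - row_b.astype(np.int32)).sum()
        if total >= best_so_far:  # early termination: this candidate cannot win
            return np.inf
    return total

def full_search(cur, ref, y, x, block=16, search=8):
    """Exhaustive search around (y, x); evolutionary methods replace this double loop."""
    target = cur[y:y + block, x:x + block]
    best_cost, best_mv = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            ry, rx = y + dy, x + dx
            if ry < 0 or rx < 0 or ry + block > ref.shape[0] or rx + block > ref.shape[1]:
                continue  # candidate block falls outside the reference frame
            cost = sad(target, ref[ry:ry + block, rx:rx + block], best_cost)
            if cost < best_cost:
                best_cost, best_mv = cost, (dy, dx)
    return best_mv, best_cost

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ref = rng.integers(0, 256, (64, 64), dtype=np.uint8)
    cur = np.roll(ref, shift=(2, -3), axis=(0, 1))  # synthetic global motion
    print(full_search(cur, ref, 16, 16))  # recovers the motion vector (-2, 3) with zero cost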
DOI: 10.1007/s42044-022-00134-5