TEXT COMPRESSION ALGORITHMS - A COMPARATIVE STUDY

Bibliographic Details
Published in: ICTACT Journal on Communication Technology, Vol. 2, No. 4, pp. 444-451
Main Authors: Senthil S., Robert L.
Format: Journal Article
Language: English
Published: ICT Academy of Tamil Nadu, 01.12.2011
ISSN: 0976-0091, 2229-6948
Description
Summary: Data compression may be defined as the science and art of representing information in a compact form. For decades, data compression has been one of the critical enabling technologies for the ongoing digital multimedia revolution. Many data compression algorithms are available for compressing files of different formats. This paper provides a survey of the basic lossless data compression algorithms. Experimental comparisons of lossless compression algorithms, covering both statistical compression techniques and dictionary-based compression techniques, were performed on text data. Among the statistical coding techniques, Shannon-Fano coding, Huffman coding, Adaptive Huffman coding, Run Length Encoding and Arithmetic coding are considered. The Lempel-Ziv scheme, a dictionary-based technique, is divided into two families: one derived from LZ77 (LZ77, LZSS, LZH, LZB and LZR) and the other derived from LZ78 (LZ78, LZW, LZFG, LZC and LZT). A set of conclusions is derived on this basis.
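To give a flavor of the statistical coding techniques the abstract lists, the following is a minimal sketch of Huffman coding using a priority queue of partial code tables. It is an illustrative assumption, not an implementation from the paper; the function name `huffman_codes` and the sample string are invented for this example.

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a prefix-free Huffman code table for the symbols of `text`."""
    freq = Counter(text)
    # Heap entries: (frequency, unique tie-breaker, {symbol: code-so-far}).
    # The tie-breaker keeps tuple comparison away from the dicts.
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:
        # Degenerate case: a single symbol still needs a one-bit code.
        return {sym: "0" for sym in heap[0][2]}
    while len(heap) > 1:
        # Repeatedly merge the two least frequent subtrees;
        # the left subtree's codes gain a '0' prefix, the right's a '1'.
        f1, i1, t1 = heapq.heappop(heap)
        f2, i2, t2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, (f1 + f2, min(i1, i2), merged))
    return heap[0][2]

codes = huffman_codes("abracadabra")
encoded = "".join(codes[ch] for ch in "abracadabra")
```

Because Huffman coding assigns shorter codes to more frequent symbols, the most common letter `a` receives the shortest code, and the 11-character sample compresses to 23 bits regardless of how ties are broken.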
DOI: 10.21917/ijct.2011.0062