TITLE

OPTIMIZATION OF LZW (LEMPEL-ZIV-WELCH) ALGORITHM TO REDUCE TIME COMPLEXITY FOR DICTIONARY CREATION IN ENCODING AND DECODING

AUTHOR(S)
Nishad, P. M.; Chezian, R. Manicka
PUB. DATE
May 2012
SOURCE
Asian Journal of Computer Science & Information Technology;May2012, Vol. 2 Issue 5, p114
SOURCE TYPE
Academic Journal
DOC. TYPE
Article
ABSTRACT
LZW (Lempel-Ziv-Welch) is a universal lossless data compression algorithm whose encoding and decoding take linear time. This paper discusses a methodology that reduces this time complexity by combining binary search with LZW. The existing LZW compression algorithm spends considerable time on dictionary creation during encoding and decoding: both the search for a pattern in the table while compressing and the search for a pattern in the table while decompressing take linear time. Using binary search while creating the dictionary lowers the comparison count, so the time complexity can be reduced optimally and gradually; combining the enhanced LZW with binary search therefore reduces the time complexity. The proposed methodology may be best suited to data compression for communication, maximizes the reduction of complexity in pattern identification for compression, and reduces the time complexity with a binary search tree (BST). Experimental results show a 94.21% improvement in compression and a 93.34% improvement in decompression.
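
To make the approach concrete, here is a minimal sketch (not the authors' implementation; the function name lzw_encode and all identifiers are illustrative) of LZW encoding in Python in which the dictionary is held as a sorted list and pattern lookups use binary search via the standard bisect module, cutting the comparisons per lookup from linear to logarithmic in the dictionary size:

    import bisect

    def lzw_encode(data: bytes) -> list[int]:
        # Dictionary as a sorted list of byte patterns plus a parallel list
        # of output codes; bisect gives O(log n) comparisons per lookup
        # instead of the O(n) linear scan of a naive table-based LZW.
        patterns = [bytes([i]) for i in range(256)]  # single bytes, already sorted
        codes = list(range(256))
        next_code = 256
        out, w = [], b""
        for byte in data:
            wc = w + bytes([byte])
            i = bisect.bisect_left(patterns, wc)
            if i < len(patterns) and patterns[i] == wc:
                w = wc                        # current match can be extended
            else:
                j = bisect.bisect_left(patterns, w)
                out.append(codes[j])          # emit code of the longest match
                patterns.insert(i, wc)        # insertion point keeps list sorted
                codes.insert(i, next_code)
                next_code += 1
                w = bytes([byte])
        if w:
            out.append(codes[bisect.bisect_left(patterns, w)])
        return out

For example, lzw_encode(b"ABABAB") yields [65, 66, 256, 256]. Note that list.insert still shifts elements in memory and production LZW implementations typically use a hash table or trie; the sketch only illustrates how binary search reduces the comparison count during dictionary creation, which is the quantity the paper targets.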
ACCESSION #
88032884

 

Related Articles

  • Multi Subgroup Data Compression Technique Using Switch Code. Daheriya, Rajesh; Kumar, Sushil; Chaudhari, Vijay; Verma, Bhupendra // International Journal on Computer Science & Engineering;2010, Vol. 2 Issue 3, p870 

    Data compression is the art of converting a data stream into a smaller number of data bits so that it can travel a long distance over a constant-bandwidth channel without adding to the load, regardless of growth in its volume. Data compression has been an essential technique since the last[1]...

  • Compression and Filtering of Random Signals under Constraint of Variable Memory. Torokhti, Anatoli; Miklavcic, Stan // Proceedings of World Academy of Science: Engineering & Technolog;Jun2008, Vol. 42, p768 

    We study a new technique for optimal data compression subject to conditions of causality and different types of memory. The technique is based on the assumption that some information about compressed data can be obtained from a solution of the associated problem without constraints of causality...

  • A Novel Real Time Algorithm for Remote Sensing Lossless Data Compression based on Enhanced DPCM. Ghamisi, P.; Mohammadzadeh, A.; Sahebi, M. R.; Sepehrband, F.; Choupan, J. // International Journal of Computer Applications;Aug2011, Vol. 27, p47 

    In this paper, the simplicity of prediction models for image transformation is used to introduce a low-complexity, efficient lossless compression method for LiDAR rasterized data and RS grayscale images, based on improving the energy compaction ability of prediction models. Further, the proposed method...

  • How to Judge a Columnar Database. Raab, David // DM Review;Dec2007, Vol. 17 Issue 12, p33 

    The article focuses on evaluating columnar databases. First introduced in the 1970s, these databases are organized by column rather than row, wherein all instances of a single data element are stored together so they can be accessed as a unit, making them particularly efficient at analytical...

  • SAMPLIFY'S DATA COMPRESSION A BLESSING FOR STORAGE.  // eWeek;4/21/2008, Vol. 25 Issue 13, p52 

    The article features the views of Chris Preimesberger regarding Samplify's data compression in the U.S. He believes that such a scheme solves bottlenecks and takes on real-world data movement and storage. It uses a patented, high-end but low-complexity compression algorithm to compress certain...

  • Data distribution schemes of sparse arrays on distributed memory multicomputers. Lin, Chun-Yuan; Chung, Yeh-Ching // Journal of Supercomputing;Jul2007, Vol. 41 Issue 1, p63 

    A data distribution scheme of sparse arrays on a distributed memory multicomputer, in general, is composed of three phases, data partition, data distribution, and data compression. To implement the data distribution scheme, many methods proposed in the literature first perform the data partition...

  • Efficient Data Compression Scheme using Dynamic Huffman Code Applied on Arabic Language. Ghwanmeh, Sameh; Al-Shalabi, Riyad; Kanaan, Ghassan // Journal of Computer Science;2006, Vol. 2 Issue 12, p887 

    The development of an efficient compression scheme to process the Arabic language represents a difficult task. This paper employs the dynamic Huffman coding on data compression with variable length bit coding, on the Arabic language. Experimental tests have been performed on both Arabic and...

  • Code size reduction by compressing repeated instruction sequences. Wang, Shao-Yang; Chang, Rong-Guey // Journal of Supercomputing;Jun2007, Vol. 40 Issue 3, p319 

    This paper presents an efficient technique for code compression. In our work, a sequence of instructions that occurs repeatedly in an application will be compressed to reduce its code size. During compression, each instruction is first divided into the operation part and the register part, and...

  • Interactive Compression of Digital Data. Carpentieri, Bruno // Algorithms;Mar2010, Vol. 3 Issue 1, p63 

    If we can use previous knowledge of the source (or knowledge of a source that is correlated with the one we want to compress) to exploit in the compression process, then we can achieve significant gains in compression. By doing this, in the fundamental source coding theorem we can substitute entropy...
