TITLE

APCFS: Autonomous and Parallel Compressed File System

AUTHOR(S)
Kella, Kush; Khanum, Aasia
PUB. DATE
August 2011
SOURCE
International Journal of Parallel Programming;Aug2011, Vol. 39 Issue 4, p522
SOURCE TYPE
Academic Journal
DOC. TYPE
Article
ABSTRACT
APCFS (Autonomous and Parallel Compressed File System) is a file system that supports fast autonomous compression at high compression rates. It is designed as a virtual layer inserted over an existing file system, compressing and decompressing data by intercepting kernel calls. The system achieves an enhanced compression ratio by combining two compression techniques; speed is attained by performing the two techniques in parallel. Experimental results indicate good performance.
ACCESSION #
60411672
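
The abstract above outlines the design only at a high level: a virtual layer over an existing file system that intercepts kernel calls and applies two compression techniques, run in parallel for speed. The sketch below is illustrative only; it assumes zlib and bz2 as stand-ins for the two (unnamed) techniques and per-block selection of the smaller output, and none of its names or choices come from the paper.

    # Illustrative sketch only, not the APCFS implementation: compress a block
    # with two stand-in algorithms in parallel and keep the smaller result,
    # tagging it so decompression can pick the matching algorithm.
    import bz2
    import zlib
    from concurrent.futures import ThreadPoolExecutor

    def compress_block(block: bytes) -> bytes:
        """Run both compressors in parallel; keep whichever output is smaller."""
        with ThreadPoolExecutor(max_workers=2) as pool:
            z = pool.submit(zlib.compress, block, 9)
            b = pool.submit(bz2.compress, block, 9)
            z_out, b_out = z.result(), b.result()
        # A one-byte tag records the winning algorithm for decompression.
        return (b"Z" + z_out) if len(z_out) <= len(b_out) else (b"B" + b_out)

    def decompress_block(payload: bytes) -> bytes:
        tag, body = payload[:1], payload[1:]
        return zlib.decompress(body) if tag == b"Z" else bz2.decompress(body)

    if __name__ == "__main__":
        data = b"example file contents " * 5000
        packed = compress_block(data)
        assert decompress_block(packed) == data
        print(f"{len(data)} -> {len(packed)} bytes")

Selecting the smaller per-block output is only one possible reading of "combining" two techniques; the paper may instead chain or interleave them, and the kernel-call interception layer is not modeled here.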


Related Articles

  • Zipping.  // Network Dictionary;2007, p541 

    An encyclopedia entry for "zipping" is presented. It refers to the process of compressing a file so that it takes up less space. The compressed file is called a zip file, which comes in two types: normal and self-executing, the latter opening automatically. It states that PKZip and WinZip are popular...

  • Compress Your Files. Branzburg, Jeffrey // Tech & Learning;Jan2005, Vol. 25 Issue 6, p32 

    This article presents tips on how to compress computer files. File compression allows users to squeeze data, greatly reducing the file size. By reducing file size, users can send and receive files over the Internet more quickly, store more files on the hard drive, and pack many related files into...

  • Design and Analysis of an Effective Corpus for Evaluation of Bengali Text Compression Schemes. Islam, Rafiqul; Rajon, S. A. Ahsan // Journal of Computers;Jan2010, Vol. 5 Issue 1, p59 

    In this paper, we propose an effective platform for evaluation of Bengali text compression schemes. A novel scheme for the construction of a Bengali text compression corpus has also been incorporated in this paper. A methodical study of the formulation approaches of text corpora for data compression...

  • Nona-Tree Weighted Finite Automata Compression of Simple Images. Lakhavijitlert, T.; Prachumrak, K. // Enformatika;2006, Vol. 14, p370 

    We propose Nona-tree Weighted Finite Automata (NWFA) as a new approach to compressing simple images. NWFA is used as a tool for the lossy compression of simple images, meaning, in this case, bi-level and simple gray-scale images. An image is divided into multiple levels of 9 partitions (nona-tree...

  • Robust Data Compression: Consistency Checking in the Synchronization of Variable Length Codes. Perkins, S.; Smith, D. H.; Ryley, A. // Computer Journal;2004, Vol. 47 Issue 3, p309 

    Many data compression techniques are essentially variable length codes. Such codes suffer from a vulnerability to error propagation. This occurs when an error causes the correspondence between the data stream and the stream of encoded symbols to be lost and continues until synchronization...

  • ISSDC: DIGRAM CODING BASED LOSSLESS DATA COMPRESSION ALGORITHM. MESUT, Altan; CARUS, Aydin // Computing & Informatics;2010, Vol. 29 Issue 4, p741 

    In this paper, a new lossless data compression method that is based on digram coding is introduced. This data compression method uses semi-static dictionaries: all of the used characters and the most frequently used two-character blocks (digrams) in the source are found and inserted into a... (An illustrative digram-coding sketch follows this list.)

  • Robust Data Compression: Variable Length Codes and Burst Errors. PERKINS, S.; SMITH, D. H. // Computer Journal;May2005, Vol. 48 Issue 3, p315 

    When variable length codes are used in the presence of error, slippage can occur. The number of symbols decoded before synchronization is regained can differ from the number of encoded symbols, causing misinterpretation of data. In previous work certain synchronization schemes have been shown to...

  • A BIT-LEVEL TEXT COMPRESSION SCHEME BASED ON THE HCDC ALGORITHM. Al-Bahadili, H.; Rababa'a, A. // International Journal of Computers & Applications (Acta Press);2010, Vol. 32 Issue 3, p355 

    In this paper we proposed and evaluated the performance of a new bit-level text compression scheme that is based on the Hamming codes based data compression (HCDC) algorithm. The scheme consists of six steps, some of which are applied repeatedly to achieve a higher compression ratio. The...

  • Lossless Data Compression. Briddock, David // Micro Mart;1/1/2015, Issue 1343, p53 

    This article takes a look at the history of data compression algorithms. Topics discussed include how communication relies on lossless solutions to restore compressed data to exactly the same state as before, and the role played by Claude Shannon and Robert Fano, as well as Abraham...
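
The ISSDC entry above describes digram coding with semi-static dictionaries. The sketch below, referenced from that entry, is a rough illustration under the assumption that byte values unused by the source can act as codes for its most frequent digrams; the dictionary layout, header format, and any repeated passes of the published ISSDC method are not reproduced, and all names are illustrative.

    # Illustrative semi-static digram coding, not the published ISSDC algorithm:
    # byte values unused by the source are reassigned as codes for its most
    # frequent digrams (two-character blocks).
    from collections import Counter

    def build_dictionary(text: bytes) -> dict:
        used = set(text)
        free_codes = [c for c in range(256) if c not in used]
        digrams = Counter(text[i:i + 2] for i in range(len(text) - 1))
        # Map the most frequent digrams onto the unused byte values.
        pairs = zip(digrams.most_common(len(free_codes)), free_codes)
        return {digram: code for (digram, _), code in pairs}

    def encode(text: bytes, table: dict) -> bytes:
        out, i = bytearray(), 0
        while i < len(text):
            digram = text[i:i + 2]
            if digram in table:
                out.append(table[digram])   # emit the single-byte digram code
                i += 2
            else:
                out.append(text[i])         # literal byte passes through unchanged
                i += 1
        return bytes(out)

    def decode(data: bytes, table: dict) -> bytes:
        rev = {code: digram for digram, code in table.items()}
        out = bytearray()
        for b in data:
            out.extend(rev.get(b, bytes([b])))
        return bytes(out)

    if __name__ == "__main__":
        sample = b"the theme of the thesis is the theory of the theatre"
        table = build_dictionary(sample)
        packed = encode(sample, table)
        assert decode(packed, table) == sample
        print(f"{len(sample)} -> {len(packed)} bytes using {len(table)} digram codes")

A real semi-static scheme must also transmit the dictionary alongside the data; this sketch omits that header and simply reports sizes.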
