Is rateless paradigm fitted for lossless compression of erasure-impaired sources?

Silvija Kokalj-Filipović, Emina Soljanin, Predrag Spasojević

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

Even though in many network scenarios erasure-impaired sources are a more appropriate data model than the frequently used Gaussian and binary sources, they entered the compression-coding scene only recently, through the introduction of binary erasure quantization over sparse graphs. Binary erasure quantization (BEQ) considers ternary sources (zeros, ones, and erasures) and binary reconstructions, where Hamming distortion is defined for the non-erasure source symbols and the distortion of any binary reconstruction of an erasure symbol is zero. We believe that constructive schemes for binary erasure quantization deserve more attention. We focus on rate-optimal zero-distortion BEQ schemes, which completely recover both the unerased bits and the positions of the erasures in the source sequence. We analyze the suitability of rateless codes for this form of BEQ in terms of compression and decompression complexity, and in terms of the rate gap with respect to the theoretically optimal rate. We demonstrate how duals of properly designed fountain codes can be used for erasure-rate-adaptive lossless compression if the compression-rate gap is traded off against the complexity of encoding and decoding. We also show that a better trade-off may exist if the dual code is doped with additional fake erasures, although these results are not conclusive. Finally, an important contribution is that we recognize and explain an idiosyncrasy of fountain codes in the construction of the dual codes employed in quantization. The common starting point is iterative erasure decoding with LDPC codes, which operates on the parity-check matrix; the same graph can therefore be reused for quantization, since it is also the generator matrix of the dual code. This is not the case with LT codes, whose decoder operates on the same graph as the encoder, so their dualization must take a completely different track, as explained here.
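To make the setting concrete, here is a minimal Python sketch (ours, not from the paper) of two ingredients the abstract names: the BEQ distortion measure and the iterative peeling decoder that resolves erasures on a parity-check graph. The small (7,4) Hamming parity-check matrix is only a stand-in for the sparse LDPC/LT graphs the paper actually studies, and the function names are illustrative.

import numpy as np

def beq_distortion(source, reconstruction):
    # Hamming distortion on unerased source symbols; any binary value
    # reconstructs an erasure ('e') at zero cost.
    return sum(1 for s, r in zip(source, reconstruction)
               if s != 'e' and s != r)

def peel_erasures(H, word):
    # Iterative (peeling) erasure decoder on parity-check matrix H.
    # Any check with exactly one erased neighbor resolves that symbol
    # to the XOR of its known neighbors; repeat until nothing peels.
    word = list(word)
    progress = True
    while progress:
        progress = False
        for row in H:
            idx = list(np.flatnonzero(row))
            erased = [j for j in idx if word[j] == 'e']
            if len(erased) == 1:
                j = erased[0]
                word[j] = sum(word[k] for k in idx if k != j) % 2
                progress = True
    return word

# Toy example: (7,4) Hamming parity-check matrix as a stand-in graph.
H = np.array([[1, 1, 1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0, 1, 0],
              [1, 0, 1, 1, 0, 0, 1]])
received = [1, 'e', 1, 0, 'e', 1, 0]  # codeword 1010010 with two erasures
decoded = peel_erasures(H, received)
print(decoded)                          # -> [1, 0, 1, 0, 0, 1, 0]
print(beq_distortion(received, decoded))  # -> 0 (zero-distortion recovery)

For an LDPC code the same matrix H that drives this decoder is also the generator matrix of the dual code, which is why the decoding graph can be reused directly for quantization; as the abstract notes, the LT construction offers no such shortcut because its decoder and encoder share the same graph.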

Original language: English (US)
Title of host publication: 2011 49th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2011
Pages: 18-25
Number of pages: 8
State: Published - 2011
Externally published: Yes
Event: 2011 49th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2011 - Monticello, IL, United States
Duration: Sep 28, 2011 - Sep 30, 2011

Publication series

Name: 2011 49th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2011

Conference

Conference: 2011 49th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2011
Country/Territory: United States
City: Monticello, IL
Period: 9/28/11 - 9/30/11

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Control and Systems Engineering
