The Wavelet Digest
   -> Volume 6, Issue 10


Preprint: Preprints on wavelet-based image compression
 
Detlev Marpe (d.marpe@fhtw-berlin.de)
Posted: Tue Sep 23, 1997 11:25 am


We would like to announce the availability of the following three papers on
wavelet-based image compression. You can download copies either from
our website

http://www.fhtw-berlin.de/Projekte/Wavelet

or via ftp from the URLs given below.

1) Title: Energy Constraint Scarce Wavelet Packet Libraries
for Image Compression

Authors: Detlev Marpe, Hans L. Cycon, and Wu Li

URL: ftp://ftp.rz.fhtw-berlin.de/pub/fhtw/fb3/pp4.ps.gz

Status: Preprint No. 541, Dept. of Mathematics,
Technical University Berlin, 1996.

Abstract: In this paper we introduce a generalization of the best basis
wavelet packet method which selects a suboptimal best basis out of a
restricted library of wavelet packet bases. For a given image this
restricted library is generated by a 2D separable multiresolution
analysis which discriminates branches of the resulting quadtree
structured representation of the image according to a measure of
energy-compaction. Thus we get a family of scarce libraries
(w.r.t. the full wavelet packet library) parameterized by a
so-called energy-threshold. This energy-threshold parameter allows a
truncation of energetically irrelevant branches and hence a control
of the complexity of the best-basis algorithm ranging from the low
complexity standard DWT to the high complexity full best-basis
algorithm. Our experimental results for image coding applications
show that the rate-distortion (RD) performance increases in
distinguished (image dependent) jumps as the energy-threshold
decreases. As a consequence, the instrument of energy-thresholding
can be used to find optima with relatively high RD performance and
relatively low complexity.
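
As a rough illustration of the idea (this is our own toy sketch, not code
from the paper), the following Python fragment applies energy-threshold
pruning to a 1D Haar wavelet packet tree: detail branches whose relative
energy falls below the threshold `eps` are truncated, so a large `eps`
degenerates toward the plain DWT path, while `eps = 0` recovers the full
best-basis search. The function names and the entropy cost measure are
our choices, not the paper's.

```python
import math

def haar_step(x):
    """One orthonormal Haar analysis step: (approximation, detail)."""
    s = 1.0 / math.sqrt(2.0)
    a = [s * (x[i] + x[i + 1]) for i in range(0, len(x), 2)]
    d = [s * (x[i] - x[i + 1]) for i in range(0, len(x), 2)]
    return a, d

def entropy_cost(x):
    """Shannon-like additive cost used as the best-basis measure."""
    e = sum(v * v for v in x)
    if e == 0.0:
        return 0.0
    return -sum((v * v / e) * math.log(v * v / e) for v in x if v != 0.0)

def best_basis(x, total_energy, eps, is_detail=False, depth=0, max_depth=3):
    """Return (cost, tree), where tree is 'leaf' or (left, right).

    Detail branches below the energy threshold are truncated, shrinking
    the searched library ("scarce" w.r.t. the full WP library)."""
    cost_here = entropy_cost(x)
    energy = sum(v * v for v in x)
    if (depth >= max_depth or len(x) < 2
            or (is_detail and energy < eps * total_energy)):
        return cost_here, "leaf"
    a, d = haar_step(x)
    ca, ta = best_basis(a, total_energy, eps, False, depth + 1, max_depth)
    cd, td = best_basis(d, total_energy, eps, True, depth + 1, max_depth)
    if ca + cd < cost_here:
        return ca + cd, (ta, td)
    return cost_here, "leaf"
```

Since the pruned library is a subset of the full one, the best cost found
with a high threshold can never undercut the cost of the full search,
which mirrors the complexity/RD trade-off described in the abstract.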

2) Title: A Complexity Constraint Best-Basis Wavelet Packet Algorithm
for Image Compression

Authors: Detlev Marpe, Hans L. Cycon, and Wu Li

URL: ftp://ftp.rz.fhtw-berlin.de/pub/fhtw/fb3/ccbb6.ps.gz

Status: submitted to Applied and Computational Harmonic Analysis,
Academic Press, 1997.

Abstract: The concept of adapted waveform analysis using a best basis selection
out of a predefined library of wavelet packet (WP) bases allows an
efficient representation of a signal. These methods usually have the
disadvantage of high computational complexity. In this paper we
introduce an extension of the best-basis method, the complexity-constrained
best-basis algorithm (CCBB), which allows an adaptive approach, has
relatively low complexity, and is memory-saving. Our CCBB algorithm
iteratively generates a scarce library of WP bases by
extending a given library according to the energy distribution of the
WP representation of the signal. This iteration process is terminated
by using a complexity measure as a control parameter. Our
experimental results for image coding applications show that the
rate-distortion (RD) performance increases in distinguished (image
dependent) jumps as the complexity is increased. This enables us to
find optima with relatively high RD performance and relatively low
complexity for processing still images as well as video sequences.
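
The iterative library growth can be caricatured in a few lines of Python.
This is a speculative toy sketch, not the CCBB implementation: we grow the
tree greedily by always splitting the highest-energy node, and we use a
simple budget of transform steps as the terminating complexity measure.
All names are hypothetical.

```python
import heapq
import math

def haar_step(x):
    """One orthonormal Haar analysis step: (approximation, detail)."""
    s = 1.0 / math.sqrt(2.0)
    return ([s * (x[i] + x[i + 1]) for i in range(0, len(x), 2)],
            [s * (x[i] - x[i + 1]) for i in range(0, len(x), 2)])

def grow_library(x, budget, max_depth=4):
    """Grow a scarce WP tree, splitting highest-energy nodes first.

    Stops after `budget` transform steps; returns (path, coeffs) leaves."""
    heap = [(-sum(v * v for v in x), "", x)]  # max-heap on node energy
    leaves = []
    steps = 0
    while heap:
        neg_e, path, data = heapq.heappop(heap)
        if steps >= budget or len(data) < 2 or len(path) >= max_depth:
            leaves.append((path, data))  # complexity budget spent: keep as leaf
            continue
        a, d = haar_step(data)
        steps += 1  # one transform step of complexity spent
        heapq.heappush(heap, (-sum(v * v for v in a), path + "a", a))
        heapq.heappush(heap, (-sum(v * v for v in d), path + "d", d))
    return leaves
```

Because the Haar steps are orthonormal, the leaves always conserve the
signal energy, and each extra unit of budget adds exactly one leaf —
a crude analogue of the complexity parameter controlling the library size.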

3) Title: Efficient Pre-Coding Techniques for Wavelet-Based Image Compression

Authors: Detlev Marpe and Hans L. Cycon

URL: ftp://ftp.rz.fhtw-berlin.de/pub/fhtw/fb3/pcs97.ps.gz

Status: Presented at the Picture Coding Symposium, Berlin, Sept. 1997.

Abstract: The principle of transform coding is a successfully established
concept in image compression. In this paper we introduce a coding
method using a fast wavelet transform and a uniform quantizer
combined with a new framework of pre-coding techniques which are
based on the concepts of partitioning, aggregation and conditional
coding (PACC). Following these concepts, the data object emerging
from the quantizer will first be partitioned into different
subsources. Parts of correlations within and between different
subsources will then be captured by aggregating homogeneous elements
into data structures like run-length codes or zerotrees. By using
models based on conditional probabilities we are able to recover
correlations between the structures constructed before as well as
cross-correlations between different subsources which will be
utilized in a final arithmetic coding stage. Experimental results
show that our proposed coding methods have a rate-distortion (RD)
performance comparable to or even better than the best zerotree-based
still image coders in the published literature, with the advantage of
lower computational complexity. In addition, we propose
and evaluate a wavelet-based video coding algorithm which outperforms
the very efficient MPEG-4 Video Verification Model (VM 5.1) in both
subjective and objective quality.
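
To make the PACC pipeline concrete, here is a minimal Python sketch of the
first two stages plus a stand-in for the third. This is not the authors'
coder: we partition quantized coefficients into three subsources
(significance map, signs, magnitudes), aggregate the significance map into
run-length symbols, and estimate the bit cost of an order-1 conditional
model as a placeholder for the final context-based arithmetic coding
stage. All function names are our own.

```python
import math
from collections import Counter

def partition(coeffs):
    """Split quantizer output into three subsources (PACC partitioning)."""
    sig = [1 if c != 0 else 0 for c in coeffs]      # significance map
    signs = [1 if c < 0 else 0 for c in coeffs if c != 0]
    mags = [abs(c) for c in coeffs if c != 0]
    return sig, signs, mags

def run_lengths(sig):
    """Aggregate the significance map into (symbol, run) pairs."""
    runs, i = [], 0
    while i < len(sig):
        j = i
        while j < len(sig) and sig[j] == sig[i]:
            j += 1
        runs.append((sig[i], j - i))
        i = j
    return runs

def conditional_bits(symbols):
    """Ideal code length (bits) under an order-1 conditional model,
    i.e. each symbol coded with the probability given its predecessor."""
    pair = Counter(zip(symbols, symbols[1:]))
    ctx = Counter(symbols[:-1])
    bits = 0.0
    for (a, b), n in pair.items():
        bits += n * -math.log2(n / ctx[a])
    return bits
```

For example, `partition([0, 0, 3, 0, -1, 0, 0, 0, 2, 0])` yields the
significance map `[0, 0, 1, 0, 1, 0, 0, 0, 1, 0]`, the signs `[0, 1, 0]`,
and the magnitudes `[3, 1, 2]`; an arithmetic coder driven by the
conditional model would then approach the bit count that
`conditional_bits` reports for each subsource.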

Thanks.

Detlev Marpe
Wavelet Project
Fachhochschule fuer Technik u. Wirtschaft
Allee der Kosmonauten 20-22
10315 Berlin
GERMANY
e-mail: dmarpe@fhtw-berlin.de