The Wavelet Digest, Volume 4, Issue 8

Preprint: New de-noising papers available from Stanford
David Donoho (donoho@playfair.Stanford.EDU)

Posted: Tue Dec 03, 2002 1:30 pm
Subject: Preprint: New de-noising papers available from Stanford


De-Noising Papers available by FTP / WWW

The following preprints are available electronically:

1. WaveLab and Reproducible Research, by J. Buckheit and D. Donoho
2. Atomic Decomposition by Basis Pursuit, by S. Chen and D. Donoho
3. Translation-Invariant De-Noising, by R. Coifman and D. Donoho
4. Wavelet threshold estimators for data with correlated noise
by I. Johnstone and B. Silverman
5. Ideal Denoising in an orthonormal basis chosen from a library of bases
by D. Donoho and I. Johnstone

The papers are available in compressed postscript format (.ps.Z).

Instructions for WWW Access:

* Open the URL
* Click on the name of the paper you desire.

Instructions for FTP Access:

The FTP addresses of these papers are

To get #1, for example, do the following

anonymous ftp to
cd pub/buckheit

Instructions for the other papers are similar.

Instructions for printing:
UNIX users should remember that on many systems "lpr"
will not work properly if the file is larger than 1.25 MB.
It may be necessary to find a workaround for large PostScript files
on your UNIX system. On many systems, "lpr -s" will work.



\title{WaveLab and Really Reproducible Research}
\author{Jon Buckheit \& David L. Donoho \\
Statistics Dept., Stanford University}

\begin{abstract} WaveLab is a library of Matlab routines for wavelet
analysis, wavelet-packet analysis, cosine-packet analysis and matching
pursuit. The library is available free of charge over the Internet.
Versions are provided for Macintosh, UNIX and Windows machines.

WaveLab makes available, in one package, all the code to reproduce all
the figures in our published wavelet articles. The interested reader can
inspect the source code to see exactly what algorithms were used, how
parameters were set in producing our figures, and can then modify the
source to produce variations on our results. WaveLab has been developed,
in part, because of exhortations by Jon Claerbout of Stanford that
computational scientists should engage in ``really reproducible''
research. \end{abstract}

\title{Atomic Decomposition by Basis Pursuit}
\author{Shaobing Chen \& David L. Donoho \\
Statistics Dept., Stanford University}

\begin{abstract} The Time-Frequency and Time-Scale communities have
recently developed a large number of overcomplete waveform dictionaries --
stationary wavelets, wavelet packets, cosine packets, chirplets, and
warplets, to name a few. Decomposition into overcomplete systems is not
unique, and several methods for decomposition have been proposed --
including the Method of Frames, Matching Pursuit, and, for special
dictionaries, the Best Orthogonal Basis.

Basis Pursuit is a principle for decomposing a signal into an ``optimal''
superposition of dictionary elements -- where optimal means having the
smallest $\ell^1$ norm of coefficients among all such decompositions. The
principle exhibits several advantages over the Method of Frames, Matching
Pursuit and Best Ortho Basis, including better sparsity, and
super-resolution. BP has interesting relations to ideas in areas as
diverse as ill-posed problems, in abstract harmonic analysis, total
variation de-noising, and multi-scale edge de-noising.

Basis Pursuit in highly overcomplete dictionaries leads to large-scale
optimization problems. With signals of length 8192 and the wavelet packet
dictionary, one gets an equivalent linear program of size 8192 by 212,992.
Such problems can be attacked successfully because of recent advances in
linear programming by interior point methods. We obtain reasonable success
with a primal-dual logarithmic potential method and conjugate gradient
solver. \end{abstract}

{\bf Key Words and Phrases.} Overcomplete signal representation,
Time-Frequency Analysis, Time-Scale Analysis, $\ell^1$ norm optimization,
Matching Pursuit, Wavelets, Wavelet Packets, Cosine Packets, Interior
point methods for linear programming, Total Variation de-noising,
multi-scale edges.
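The $\ell^1$ principle above can be recast as a standard-form linear program, which a generic LP solver handles at small scale. A minimal sketch in Python using SciPy's `linprog`; the three-atom dictionary and the signal below are toy inventions for illustration, not examples from the paper:

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, y):
    """Solve min ||x||_1 subject to A x = y as a linear program.

    Split x = u - v with u, v >= 0, so ||x||_1 = sum(u + v); the
    constraint becomes A(u - v) = y, a standard-form LP.
    """
    n_atoms = A.shape[1]
    c = np.ones(2 * n_atoms)          # objective: sum of u and v entries
    A_eq = np.hstack([A, -A])         # encodes A(u - v) = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
    u, v = res.x[:n_atoms], res.x[n_atoms:]
    return u - v

# Toy overcomplete dictionary: two coordinate atoms plus one "joint" atom.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
y = np.array([2.0, 2.0])
x = basis_pursuit(A, y)   # the minimal-l1 solution uses only the third atom
```

The variable-splitting trick doubles the problem size, which is exactly why signals of length 8192 produce the large-scale LPs the abstract mentions.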

\title{Translation-Invariant De-Noising}
\author{David Donoho \\ Department of Statistics \\ Stanford University \\
R. R. Coifman \\ Mathematics \\ Yale University}

\begin{abstract} We discuss a method for making wavelet thresholding
de-noising translation invariant. It is based on averaging the results of
processing data arising in all possible circulant shifts of the original
data. There are fast algorithms to accomplish this in order $n \log n$
time. The method is related to simple thresholding in Mallat's
undecimated wavelet transform. The method has a number of performance
benefits; for example, when used with the Haar wavelet it suppresses Gibbs
phenomena in expectation. Similar ideas make sense with cosine packets,
wavelet packets and other orthogonal bases. \end{abstract}
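The shift-average-unshift recipe can be written out directly. A sketch, with a single-level Haar transform standing in for the full wavelet transform; this direct loop costs $O(n^2)$, whereas the paper's fast algorithm achieves $n \log n$:

```python
import numpy as np

def haar_threshold(x, t):
    """One-level Haar transform, hard-threshold the details, invert.
    Assumes a float array of even length."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    d = np.where(np.abs(d) > t, d, 0.0)    # hard threshold the details
    y = np.empty_like(x)
    y[0::2] = (a + d) / np.sqrt(2)         # inverse one-level Haar
    y[1::2] = (a - d) / np.sqrt(2)
    return y

def cycle_spin(x, t):
    """Translation-invariant de-noising: average the results of
    thresholding over all n circulant shifts of the data."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    out = np.zeros(n)
    for k in range(n):
        shifted = np.roll(x, k)                          # shift the data
        out += np.roll(haar_threshold(shifted, t), -k)   # de-noise, unshift
    return out / n
```

Averaging over shifts removes the dependence of Haar thresholding on where block boundaries fall, which is the source of the Gibbs-type artifacts the abstract refers to.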

\title{Wavelet threshold estimators for data with correlated noise}
\author{Iain M. Johnstone \\ Department of Statistics \\ Stanford
University \\ Stanford CA 94305 \\ U.S.A. and
Bernard W. Silverman \\ School of
Mathematics \\ Bristol University \\ Bristol BS8 1TW \\ U.K.}

\begin{abstract} Wavelet threshold estimators for data with stationary correlated noise are
constructed by the following prescription. First, form the discrete
wavelet transform of the data points. Next, apply a {\em level-dependent}
soft threshold to the individual coefficients, allowing the thresholds to
depend on the level in the wavelet transform. Finally, transform back to
obtain the estimate in the original domain. The threshold used at level
$j$ is $s_j \sqrt{2 \log n}$, where $s_j$ is the standard deviation of the
coefficients at that level, and $n$ is the overall sample size. The
minimax properties of the estimators are investigated by considering a
general problem in multivariate normal decision theory, concerned with the
estimation of the mean vector of a general multivariate normal
distribution subject to squared error loss. An ideal risk is obtained by
the use of an `oracle' that provides the optimum diagonal projection
estimate. This `benchmark' risk can be considered in its own right as a
measure of the sparseness of the signal relative to the noise process, and
in the wavelet context it can be considered as the risk obtained by ideal
spatial adaptivity. It is shown that the level-dependent threshold
estimator performs well relative to the benchmark risk, and that its
minimax behaviour cannot be improved upon in order of magnitude by any
other estimator. \end{abstract}

{\it Key Words and Phrases:} Decision
theory, Level-dependent thresholding, Minimax estimation, Nonlinear
estimators, Nonparametric regression, Oracle inequality, Wavelet
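The three-step prescription (transform, level-dependent soft threshold, invert) is easy to sketch. A minimal version with a hand-rolled Haar transform; estimating $s_j$ from the coefficients at each level, as done below, is a simplification for illustration, not the paper's estimator:

```python
import numpy as np

def haar_dwt(x):
    """Full Haar DWT: list of detail arrays (finest level first) plus
    the final approximation. Assumes len(x) is a power of two."""
    details, a = [], np.asarray(x, dtype=float)
    while len(a) > 1:
        details.append((a[0::2] - a[1::2]) / np.sqrt(2))
        a = (a[0::2] + a[1::2]) / np.sqrt(2)
    return details, a

def haar_idwt(details, a):
    """Invert haar_dwt."""
    for d in reversed(details):
        nxt = np.empty(2 * len(a))
        nxt[0::2] = (a + d) / np.sqrt(2)
        nxt[1::2] = (a - d) / np.sqrt(2)
        a = nxt
    return a

def level_dependent_denoise(y):
    """Soft-threshold level j at s_j * sqrt(2 log n), then invert."""
    n = len(y)
    details, approx = haar_dwt(y)
    shrunk = []
    for d in details:
        s_j = np.std(d)                    # crude estimate of the level-j scale
        t_j = s_j * np.sqrt(2 * np.log(n))
        shrunk.append(np.sign(d) * np.maximum(np.abs(d) - t_j, 0.0))
    return haar_idwt(shrunk, approx)
```

Allowing $t_j$ to grow with the per-level standard deviation is what adapts the estimator to stationary correlated noise, whose spectrum makes coarse and fine levels differently noisy.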

\title{Ideal Denoising in an orthonormal basis chosen from a library of
bases}
\author{David L. Donoho \& Iain M. Johnstone \\
Department of Statistics \\ Stanford University}

\begin{abstract} Suppose we have observations $y_i = s_i + z_i$,
$i=1,\ldots,n$, where $(s_i)$ is signal and $(z_i)$ is i.i.d. Gaussian white
noise. Suppose we have available a library ${\cal L}$ of orthogonal bases, such
as the Wavelet Packet bases or the Cosine Packet bases of Coifman and
Meyer. We wish to select, adaptively based on the noisy data $(y_i)$, a
basis in which best to recover the signal (``de-noising''). Let $M_n$ be
the total number of distinct vectors occurring among all bases in the
library and let $t_n = \sqrt{2 \log(M_n)}$. (For wavelet packets, $M_n = n

Let $y[{\cal B}]$ denote the original data $y$ transformed into the basis
${\cal B}$. Choose $\lambda > 8$ and set $\Lambda_n = (\lambda \cdot (1 +
t_n))^2$. Define the entropy functional
$${\cal E}_\lambda(y,{\cal B}) = \sum_i \min(y_i^2[{\cal B}], \Lambda_n^2).$$
Let $\hat{\cal B}$ be the best orthogonal basis according to this entropy:
$$\hat{\cal B} = \mathop{\rm argmin}_{{\cal B} \in {\cal L}} {\cal E}_\lambda(y,{\cal B}).$$

Define the hard-threshold nonlinearity $\eta_t(y) = y \, 1_{\{|y| > t\}}$.
In the empirical best basis, apply hard-thresholding with threshold $t =
\sqrt{\Lambda_n}$:
$$\hat{s}_i^*[\hat{\cal B}] = \eta_{\sqrt{\Lambda_n}} (y_i[\hat{\cal B}]).$$

{\it Theorem:} {\sl With probability exceeding $\pi_n = 1-e/M_n$,
$$\|\hat{s}^* - s\|_2^2 \leq (1-8/\lambda)^{-1} \cdot \Lambda_n \cdot
\min_{{\cal B} \in {\cal L}} E \| \hat{s}_{\cal B} - s \|_2^2.$$
Here the minimum is over all ideal procedures working in all bases of the
library, i.e. in basis ${\cal B}$,
$\hat{s}_{\cal B}$ is just $y_i[{\cal B}] \, 1_{\{|s_i[{\cal B}]| > 1\}}$.}

In short, the basis-adaptive estimator achieves a loss within a
logarithmic factor of the ideal risk which would be achievable if one had
available an oracle which would supply perfect information about the ideal
basis in which to de-noise, and also about which coordinates were large or
small.

The result extends in obvious ways to more general orthogonal basis
libraries, basically to any libraries constructed from an at-most
polynomially-growing number of coefficient functionals. Parallel results
can be developed for closely related entropies. end{abstract}

{\bf Key Words.} Wavelet Packets, Cosine Packets, weak-$\ell^p$ spaces,
Adaptive Basis Selection, Oracles for adaptation, Thresholding of Wavelet
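At toy scale, the whole procedure (entropy, best basis, hard threshold) fits in a few lines. A sketch with a two-basis library, identity and Haar, as stand-ins for a real wavelet/cosine packet library; the count `M_n` below is a crude placeholder, and the formulas follow the abstract as stated:

```python
import numpy as np

def haar_matrix(n):
    """Orthonormal Haar basis as an n x n matrix (n a power of two)."""
    if n == 1:
        return np.array([[1.0]])
    h = haar_matrix(n // 2)
    top = np.kron(h, [1.0, 1.0]) / np.sqrt(2)                # averaging rows
    bot = np.kron(np.eye(n // 2), [1.0, -1.0]) / np.sqrt(2)  # differencing rows
    return np.vstack([top, bot])

def best_basis_denoise(y, library, lam=9.0):
    """Pick the basis minimizing E(y, B) = sum_i min(y_i[B]^2, Lambda_n^2),
    then hard-threshold at sqrt(Lambda_n), as in the abstract."""
    n = len(y)
    M_n = len(library) * n                 # placeholder count of distinct vectors
    t_n = np.sqrt(2 * np.log(M_n))
    Lam = (lam * (1.0 + t_n)) ** 2         # Lambda_n, with lambda > 8

    def entropy(B):
        c = B @ y
        return float(np.sum(np.minimum(c ** 2, Lam ** 2)))

    best = min(library, key=entropy)       # empirical best basis
    c = best @ y
    c_hat = np.where(np.abs(c) > np.sqrt(Lam), c, 0.0)   # hard threshold
    return best.T @ c_hat                  # back to the original domain
```

For a strong constant signal the Haar basis concentrates the energy in one coefficient, so its capped entropy is smaller, it is selected, and the hard threshold keeps only that coefficient.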