The Wavelet Digest
> Volume 4, Issue 8
Preprint: New denoising papers available from Stanford

David Donoho (donoho@playfair.Stanford.EDU)

Posted: Tue Dec 03, 2002 1:30 pm Subject: Preprint: New denoising papers available from Stanford




De-Noising Papers available by FTP / WWW
The following preprints are available electronically:
1. WaveLab and Reproducible Research, by J. Buckheit and D. Donoho
2. Atomic Decomposition by Basis Pursuit, by S. Chen and D. Donoho
3. Translation-Invariant De-Noising, by R. Coifman and D. Donoho
4. Wavelet threshold estimators for data with correlated noise
by I. Johnstone and B. Silverman
5. Ideal Denoising in an orthonormal basis chosen from a library of bases
by D. Donoho and I. Johnstone
The papers are available in compressed postscript format (.ps.Z).
Instructions for WWW Access:
* Open the URL http://playfair.stanford.edu/subjects/wavelets.html
* Click on the name of the paper you desire.
Instructions for FTP Access:
The FTP addresses of these papers are:
1. ftp://playfair.stanford.edu/pub/buckheit/wavelab.ps.Z
2. ftp://playfair.stanford.edu/pub/chen_s/BasisPursuit.ps.Z
3. ftp://playfair.stanford.edu/pub/donoho/TIDeNoise.ps.Z
4. ftp://playfair.stanford.edu/pub/johnstone/correlated.ps.Z
5. ftp://playfair.stanford.edu/pub/donoho/idealbasis.ps.Z
To get #1, for example, do the following:
* anonymous FTP to playfair.stanford.edu
* cd pub/buckheit
* binary
* get wavelab.ps.Z
Instructions for the other papers are similar.
Instructions for printing:
UNIX users should remember that on many systems "lpr file.ps"
will not work properly if file.ps is larger than 1.25 MB.
It may be necessary to find a workaround for large PostScript files
on your UNIX system. On many UNIX systems, "lpr -s file.ps" (which
spools a symbolic link rather than copying the file) will work.
Questions: donoho@playfair.stanford.edu
Abstracts:
\title{WaveLab and Really Reproducible Research}
\author{Jon Buckheit \& David L. Donoho \\
Statistics Dept., Stanford University}
\begin{abstract} WaveLab is a library of Matlab routines for wavelet
analysis, wavelet-packet analysis, cosine-packet analysis and matching
pursuit. The library is available free of charge over the Internet.
Versions are provided for Macintosh, UNIX and Windows machines.
WaveLab makes available, in one package, all the code to reproduce all
the figures in our published wavelet articles. The interested reader can
inspect the source code to see exactly what algorithms were used, how
parameters were set in producing our figures, and can then modify the
source to produce variations on our results. WaveLab has been developed,
in part, because of exhortations by Jon Claerbout of Stanford that
computational scientists should engage in ``really reproducible''
research. \end{abstract}
\title{Atomic Decomposition by Basis Pursuit}
\author{Shaobing Chen \& David L. Donoho \\
Statistics Dept., Stanford University}
\begin{abstract} The Time-Frequency and Time-Scale communities have
recently developed a large number of overcomplete waveform dictionaries:
stationary wavelets, wavelet packets, cosine packets, chirplets, and
warplets, to name a few. Decomposition into overcomplete systems is not
unique, and several methods for decomposition have been proposed,
including the Method of Frames, Matching Pursuit, and, for special
dictionaries, the Best Orthogonal Basis.
Basis Pursuit is a principle for decomposing a signal into an ``optimal''
superposition of dictionary elements, where optimal means having the
smallest $\ell^1$ norm of coefficients among all such decompositions. The
principle exhibits several advantages over the Method of Frames, Matching
Pursuit and Best Ortho Basis, including better sparsity and
super-resolution. BP has interesting relations to ideas in areas as
diverse as ill-posed problems, abstract harmonic analysis, total
variation denoising, and multiscale edge denoising.
Basis Pursuit in highly overcomplete dictionaries leads to large-scale
optimization problems. With signals of length 8192 and the wavelet packet
dictionary, one gets an equivalent linear program of size 8192 by 212,992.
Such problems can be attacked successfully because of recent advances in
linear programming by interior point methods. We obtain reasonable success
with a primal-dual logarithmic potential method and conjugate gradient
solver. \end{abstract}
{\bf Key Words and Phrases.} Overcomplete signal representation,
Time-Frequency Analysis, Time-Scale Analysis, $\ell^1$ norm optimization,
Matching Pursuit, Wavelets, Wavelet Packets, Cosine Packets, Interior
point methods for linear programming, Total Variation denoising,
multiscale edges.
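As a rough illustration of the $\ell^1$ principle (not the primal-dual logarithmic potential code described in the abstract), basis pursuit can be posed as a small linear program by splitting the coefficients into positive and negative parts. The sketch below is a toy example, assuming SciPy's general-purpose linprog solver and a hypothetical random dictionary with unit-norm atoms:

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(D, y):
    """Solve min ||c||_1 subject to D c = y as a linear program.

    Split c = u - v with u, v >= 0, so ||c||_1 = sum(u + v) and the
    equality constraint D(u - v) = y becomes [D, -D] [u; v] = y.
    """
    m, p = D.shape
    cost = np.ones(2 * p)                 # objective: sum(u) + sum(v)
    A_eq = np.hstack([D, -D])             # D u - D v = y
    res = linprog(cost, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
    u, v = res.x[:p], res.x[p:]
    return u - v

# Toy overcomplete dictionary: 4 samples, 8 unit-norm atoms.
rng = np.random.default_rng(0)
D = rng.standard_normal((4, 8))
D /= np.linalg.norm(D, axis=0)
c_true = np.zeros(8)
c_true[3] = 2.0                           # a 1-sparse synthesis
y = D @ c_true
c = basis_pursuit(D, y)
print(np.allclose(D @ c, y, atol=1e-6))   # the decomposition reproduces y
```

The split into u and v is the standard textbook reduction of an $\ell^1$ objective to a linear program; the paper's interior-point machinery matters only at realistic problem sizes.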
\title{Translation-Invariant De-Noising}
\author{David Donoho \\ Department of Statistics \\ Stanford University
\\ and \\
R. R. Coifman \\ Mathematics \\ Yale University}
\begin{abstract} We discuss a method for making wavelet thresholding
denoising translation invariant. It is based on averaging the results of
processing data arising in all possible circulant shifts of the original
data. There are fast algorithms to accomplish this in order $n \log n$
time. The method is related to simple thresholding in Mallat's
undecimated wavelet transform. The method has a number of performance
benefits; for example when used with the Haar wavelet it suppresses Gibbs
phenomena in expectation. Similar ideas make sense with cosine packets,
wavelet packets and other orthogonal bases. \end{abstract}
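A minimal sketch of the averaging idea, assuming a single-level orthonormal Haar transform and soft thresholding of the detail coefficients. The naive loop over all $n$ circulant shifts below costs order $n^2$; the paper's fast $n \log n$ algorithm is not reproduced here:

```python
import numpy as np

def haar(x):
    """One level of the orthonormal Haar transform."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # detail coefficients
    return a, d

def ihaar(a, d):
    """Inverse of one Haar level."""
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x

def soft(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def denoise_shift(y, t):
    """Plain (shift-sensitive) Haar soft-threshold de-noising."""
    a, d = haar(y)
    return ihaar(a, soft(d, t))

def ti_denoise(y, t):
    """Translation-invariant de-noising: shift, de-noise, unshift, average."""
    n = y.size
    out = np.zeros(n)
    for k in range(n):
        out += np.roll(denoise_shift(np.roll(y, k), t), -k)
    return out / n
```

With the Haar wavelet, this averaging over shifts is what suppresses the Gibbs-like artifacts that plain thresholding produces near discontinuities.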
\title{Wavelet threshold estimators for data with correlated noise}
\author{Iain M. Johnstone \\ Department of Statistics \\ Stanford
University \\ Stanford CA 94305 \\ U.S.A. \\ and \\
Bernard W. Silverman \\ School of
Mathematics \\ Bristol University \\ Bristol BS8 1TW}
\begin{abstract}
Wavelet threshold estimators for data with stationary correlated noise are
constructed by the following prescription. First, form the discrete
wavelet transform of the data points. Next, apply a {\em level-dependent}
soft threshold to the individual coefficients, allowing the thresholds to
depend on the level in the wavelet transform. Finally, transform back to
obtain the estimate in the original domain. The threshold used at level
$j$ is $s_j \sqrt{2 \log n}$, where $s_j$ is the standard deviation of the
coefficients at that level, and $n$ is the overall sample size. The
minimax properties of the estimators are investigated by considering a
general problem in multivariate normal decision theory, concerned with the
estimation of the mean vector of a general multivariate normal
distribution subject to squared error loss. An ideal risk is obtained by
the use of an `oracle' that provides the optimum diagonal projection
estimate. This `benchmark' risk can be considered in its own right as a
measure of the sparseness of the signal relative to the noise process, and
in the wavelet context it can be considered as the risk obtained by ideal
spatial adaptivity. It is shown that the level-dependent threshold
estimator performs well relative to the benchmark risk, and that its
minimax behaviour cannot be improved upon in order of magnitude by any
other estimator. \end{abstract}
{\it Key Words and Phrases:} Decision
theory, Level-dependent thresholding, Minimax estimation, Nonlinear
estimators, Nonparametric regression, Oracle inequality, Wavelet
transform.
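The three-step prescription can be sketched as follows, with two caveats: a plain multi-level Haar transform stands in for a general wavelet transform, and the sample standard deviation of each level's coefficients is used as a crude stand-in for $s_j$ (the paper's setting of stationary correlated noise is what makes a per-level scale necessary):

```python
import numpy as np

SQRT2 = np.sqrt(2.0)

def dwt_haar(x, levels):
    """Multi-level orthonormal Haar transform: coarse block plus details."""
    details = []
    a = np.asarray(x, dtype=float)
    for _ in range(levels):
        details.append((a[0::2] - a[1::2]) / SQRT2)
        a = (a[0::2] + a[1::2]) / SQRT2
    return a, details

def idwt_haar(a, details):
    """Invert dwt_haar."""
    for d in reversed(details):
        x = np.empty(2 * a.size)
        x[0::2] = (a + d) / SQRT2
        x[1::2] = (a - d) / SQRT2
        a = x
    return a

def soft(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def level_dependent_denoise(y, levels=3):
    """Step 1: transform. Step 2: threshold each level by s_j sqrt(2 log n).
    Step 3: transform back."""
    n = y.size
    a, details = dwt_haar(y, levels)
    thresholded = []
    for d in details:
        s_j = d.std()                          # crude per-level scale estimate
        t_j = s_j * np.sqrt(2.0 * np.log(n))   # the paper's threshold form
        thresholded.append(soft(d, t_j))
    return idwt_haar(a, thresholded)
```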
\title{Ideal Denoising in an orthonormal basis chosen from a library of
bases}
\author{David L. Donoho \& Iain M. Johnstone \\
Department of Statistics \\ Stanford University}
\begin{abstract} Suppose we have observations $y_i = s_i + z_i$,
$i=1,\ldots,n$, where $(s_i)$ is signal and $(z_i)$ is i.i.d. Gaussian white
noise. Suppose we have available a library $\mathcal{L}$ of orthogonal bases, such
as the Wavelet Packet bases or the Cosine Packet bases of Coifman and
Meyer. We wish to select, adaptively based on the noisy data $(y_i)$, a
basis in which best to recover the signal (``de-noising''). Let $M_n$ be
the total number of distinct vectors occurring among all bases in the
library and let $t_n = \sqrt{2 \log(M_n)}$. (For wavelet packets, $M_n = n
\log_2(n)$.)
Let $y[\mathcal{B}]$ denote the original data $y$ transformed into the basis
$\mathcal{B}$. Choose $\lambda > 8$ and set $\Lambda_n = (\lambda \cdot (1 +
t_n))^2$. Define the entropy functional
\[
\mathcal{E}_\lambda(y,\mathcal{B}) = \sum_i \min(y_i^2[\mathcal{B}], \Lambda_n) .
\]
Let $\hat{\mathcal{B}}$ be the best orthogonal basis according to this entropy:
\[
\hat{\mathcal{B}} = \mathop{\rm argmin}_{\mathcal{B} \in \mathcal{L}} \mathcal{E}_\lambda(y,\mathcal{B}) .
\]
Define the hard-threshold nonlinearity $\eta_t(y) = y \, 1_{\{|y| > t\}}$.
In the empirical best basis, apply hard-thresholding with threshold $t =
\sqrt{\Lambda_n}$:
\[
\hat{s}_i^*[\hat{\mathcal{B}}] = \eta_{\sqrt{\Lambda_n}} (y_i[\hat{\mathcal{B}}]) .
\]
{\it Theorem:} {\sl With probability exceeding $\pi_n = 1 - e/M_n$,
\[
\| \hat{s}^* - s \|_2^2 \leq (1 - 8/\lambda)^{-1} \cdot \Lambda_n \cdot
\min_{\mathcal{B} \in \mathcal{L}} E \| \hat{s}_{\mathcal{B}} - s \|_2^2 .
\]
Here the minimum is over all ideal procedures working in all bases of the
library, i.e. in basis $\mathcal{B}$,
$\hat{s}_{\mathcal{B}}$ is just $y_i[\mathcal{B}] \, 1_{\{|s_i[\mathcal{B}]| > 1\}}$.}
In short, the basis-adaptive estimator achieves a loss within a
logarithmic factor of the ideal risk which would be achievable if one had
available an oracle which would supply perfect information about the ideal
basis in which to denoise, and also about which coordinates were large or
small.
The result extends in obvious ways to more general orthogonal basis
libraries, basically to any library constructed from an at most
polynomially growing number of coefficient functionals. Parallel results
can be developed for closely related entropies. \end{abstract}
{\bf Key Words.} Wavelet Packets, Cosine Packets, weak-$\ell^p$ spaces,
Adaptive Basis Selection, Oracles for adaptation, Thresholding of Wavelet
Coefficients.
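A toy version of basis selection by this entropy, using a hypothetical two-basis library (the standard basis and the full orthonormal Haar basis) and writing the cap simply as a parameter `lam` rather than the calibrated $\Lambda_n$ above:

```python
import numpy as np

def entropy(coeffs, lam):
    """E_lambda: sum of min(y_i^2, cap) over the coefficients in one basis."""
    return np.sum(np.minimum(coeffs ** 2, lam))

def full_haar(x):
    """Coefficients of x in the full orthonormal Haar basis."""
    a = np.asarray(x, dtype=float)
    out = []
    while a.size > 1:
        out.append((a[0::2] - a[1::2]) / np.sqrt(2.0))  # details at this level
        a = (a[0::2] + a[1::2]) / np.sqrt(2.0)
    out.append(a)                                       # coarsest coefficient
    return np.concatenate(out)

# A spike is 1-sparse in the standard basis but spreads across
# log2(n) + 1 Haar coefficients, so its capped entropy is larger there.
y = np.zeros(8)
y[0] = 10.0
lam = 4.0
library = {"standard": y, "haar": full_haar(y)}
best = min(library, key=lambda b: entropy(library[b], lam))
print(best)   # -> standard
```

Here the entropy in the standard basis is min(100, 4) = 4, while the spike's energy spreads over four nonzero Haar coefficients, each capped at 4, giving 16; the entropy correctly prefers the basis in which the signal is sparse.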




