The Wavelet Digest
   -> Volume 2, Issue 10

Papers on wavelet de-noising available electronically.
David Donoho, Department of Statistics, Stanford University.

PostPosted: Fri Nov 29, 2002 3:32 pm    
Subject: Papers on wavelet de-noising available electronically.

To get these, use anonymous FTP. After connecting, "cd pub/reports", then "ls"
to get a listing of available files, and "get filename" to retrieve a specific
file. Files are in either LaTeX or plain TeX, and figures are in PostScript.

Questions? E-mail Iain Johnstone or Dave Donoho.

David L. Donoho, Iain M. Johnstone, Gérard Kerkyacharian
and Dominique Picard.
"Wavelet Shrinkage: Asymptopia"
TR 419, March 1993, Submitted for publication
asymp.tex (LaTex)

Considerable effort has been directed recently to develop
asymptotically minimax methods in problems of recovering
infinite-dimensional objects (curves, densities, spectral densities,
images) from noisy data. A rich and complex body of work has evolved,
with nearly- or exactly- minimax estimators being obtained for a
variety of interesting problems. Unfortunately, the results have
often not been translated into practice, for a variety of reasons --
sometimes, similarity to known methods, sometimes, computational
intractability, and sometimes, lack of spatial adaptivity. We
discuss a method for curve estimation based on $n$ noisy data; one
translates the empirical wavelet coefficients towards the origin by an
amount $\sqrt{2 \log(n)}\, \sigma/\sqrt{n}$. The method is different from
methods in common use today, is computationally practical, and is
spatially adaptive; thus it avoids a number of previous objections to
minimax estimators. At the same time, the method is nearly minimax
for a wide variety of loss functions -- e.g. pointwise error, global
error measured in $L^p$ norms, pointwise and global error in
estimation of derivatives -- and for a wide range of smoothness
classes, including standard H\"{o}lder classes, Sobolev classes, and
Bounded Variation. This is a much broader near-optimality than
anything previously proposed in the minimax literature. Finally, the
theory underlying the method is interesting, as it exploits a
correspondence between statistical questions and questions of optimal
recovery and information-based complexity.
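The shrinkage rule just described is short enough to sketch. What follows is a minimal numpy illustration, not the authors' code: an orthonormal Haar transform with soft thresholding, the threshold written as $\sigma \sqrt{2 \log(n)}$ in the sampled-data normalization where each observation carries noise of standard deviation $\sigma$ (taken as known here; in practice it would be estimated from the finest-scale coefficients).

```python
import numpy as np

def haar_dwt(x):
    """Full orthonormal Haar transform; len(x) must be a power of two.
    Returns a list of detail arrays (finest first) plus the coarse array."""
    coeffs, a = [], x.astype(float)
    while len(a) > 1:
        coeffs.append((a[0::2] - a[1::2]) / np.sqrt(2))   # detail coefficients
        a = (a[0::2] + a[1::2]) / np.sqrt(2)              # scaling coefficients
    return coeffs + [a]

def haar_idwt(coeffs):
    """Invert haar_dwt."""
    a = coeffs[-1]
    for d in reversed(coeffs[:-1]):
        out = np.empty(2 * len(a))
        out[0::2] = (a + d) / np.sqrt(2)
        out[1::2] = (a - d) / np.sqrt(2)
        a = out
    return a

def soft(x, t):
    """Translate coefficients towards the origin by t (soft thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def visu_shrink(y, sigma):
    """Wavelet shrinkage with the universal threshold sigma * sqrt(2 log n)."""
    t = sigma * np.sqrt(2 * np.log(len(y)))
    coeffs = haar_dwt(y)
    denoised = [soft(d, t) for d in coeffs[:-1]] + [coeffs[-1]]  # coarse kept
    return haar_idwt(denoised)
```

Applied to a noisy step function, the reconstruction keeps the jump sharp while flattening the noise, which is the spatial adaptivity the abstract emphasizes.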

David L. Donoho and Iain M. Johnstone
"Adapting to Unknown Smoothness via Wavelet Shrinkage"
June 1993, Submitted for publication
ausws.tex (LaTex)

We attempt to recover a function of unknown smoothness from noisy,
sampled data. We introduce a procedure, {\it SureShrink}, which
suppresses noise by thresholding the empirical wavelet coefficients.
The thresholding is adaptive: a threshold level is assigned to each
dyadic resolution level by the principle of minimizing the Stein
Unbiased Estimate of Risk ({\it Sure}) for threshold estimates. The
computational effort of the overall procedure is order $N \cdot
\log(N)$ as a function of the sample size $N$.

{\it SureShrink} is smoothness-adaptive: if the unknown function
contains jumps, the reconstruction (essentially) does also; if the
unknown function has a smooth piece, the reconstruction is
(essentially) as smooth as the mother wavelet will allow. The
procedure is in a sense optimally smoothness-adaptive: it is
near-minimax simultaneously over a whole interval of the Besov scale;
the size of this interval depends on the choice of mother wavelet. We
know from a previous paper by the authors that traditional smoothing
methods -- kernels, splines, and orthogonal series estimates -- even
with optimal choices of the smoothing parameter, would be unable to
perform in a near-minimax way over many spaces in the Besov scale.

Examples of {\it SureShrink} are given: the advantages of the method
are particularly evident when the underlying function has jump
discontinuities on a smooth background.

David L. Donoho, Iain M. Johnstone, Gérard Kerkyacharian
and Dominique Picard.
"Density estimation by wavelet thresholding"
June 1993, Submitted for publication
dens.tex (LaTex)

Density estimation is a commonly used test case for non-parametric
estimation methods. We explore the asymptotic properties of estimators
based on thresholding of empirical wavelet coefficients. Minimax rates
of convergence are studied over a large range of Besov function
classes $B_{s,p,q}$ and for a range of global $L_{p'}$ error measures,
$1 \leq p' < \infty$. A single wavelet threshold estimator is
asymptotically minimax within logarithmic terms simultaneously over a
range of spaces and error measures. In particular, when $p' > p$, some
form of non-linearity is essential, since the minimax linear
estimators are suboptimal by polynomial powers of $n$. A second
approach, using an approximation of a Gaussian white noise model in a
Mallows metric, is used to attain exactly optimal rates of convergence
for quadratic error ($p' = 2$).
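A toy version of the idea can be built from a histogram: bin heights play the role of empirical scaling coefficients, the Haar detail coefficients are soft-thresholded, and the inverse transform returns the estimate. This is illustrative only, not the paper's estimator; the universal-style threshold and the MAD noise estimate are simplifications.

```python
import numpy as np

def haar(x):
    out, a = [], x.astype(float)
    while len(a) > 1:
        out.append((a[0::2] - a[1::2]) / np.sqrt(2))
        a = (a[0::2] + a[1::2]) / np.sqrt(2)
    return out + [a]

def ihaar(c):
    a = c[-1]
    for d in reversed(c[:-1]):
        out = np.empty(2 * len(a))
        out[0::2], out[1::2] = (a + d) / np.sqrt(2), (a - d) / np.sqrt(2)
        a = out
    return a

def soft(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def density_estimate(samples, nbins=64):
    """Histogram density on [0, 1] with Haar-thresholded detail coefficients."""
    counts, _ = np.histogram(samples, bins=nbins, range=(0.0, 1.0))
    h = counts / (len(samples) * (1.0 / nbins))      # raw histogram density
    c = haar(h)
    sigma = np.median(np.abs(c[0])) / 0.6745         # scale from finest details
    t = sigma * np.sqrt(2 * np.log(nbins))
    c = [soft(d, t) for d in c[:-1]] + [c[-1]]       # coarse level untouched
    return ihaar(c), h
```

Because the coarse coefficient is kept, the thresholded estimate integrates to one exactly, while the noisy fine-scale wiggles of the raw histogram are suppressed.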

Iain M. Johnstone
"Minimax Bayes, asymptotic minimax and sparse wavelet priors"
TR 420, April 1993, To appear, Proc. 5th Purdue Symposium on Decision Theory.
priors.tex (LaTex)

Pinsker (1980) gave a precise asymptotic evaluation of the minimax mean
squared error of estimation of a signal in Gaussian noise when the
signal is known a priori to lie in a compact ellipsoid in
Hilbert space. This `Minimax Bayes' method can be applied to a
variety of global non-parametric estimation settings with parameter
spaces far from ellipsoidal. For example it leads to a theory of
exact asymptotic minimax estimation over norm balls in Besov and
Triebel spaces using simple co-ordinatewise estimators and wavelet bases.

This paper outlines some features of the method common to several
applications. In particular, we derive new results on the exact
asymptotic minimax risk over weak $\ell_p$-balls in ${\cal R}^n$ as
$n \rightarrow \infty$, and also for a class of `local' estimators on
the Triebel scale.

By its very nature, the method reveals the structure of asymptotically
least favorable distributions. Thus we may simulate `least favorable'
sample paths. We illustrate this for estimation of a signal in
Gaussian white noise over norm balls in certain Besov spaces. In
wavelet bases, when $p < 2$, the least favorable priors are sparse,
and the resulting sample paths strikingly different from those
observed in Pinsker's ellipsoidal setting ($p=2$).
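The contrast described above is easy to simulate. The sketch below is illustrative only: it uses an ad-hoc two-point sparse prior calibrated to match the Gaussian prior's expected energy, not the paper's exact least favorable distributions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1024

# Pinsker-style setting (p = 2): i.i.d. Gaussian coefficients.
gauss = rng.standard_normal(n)

# Sparse regime (p < 2): each coefficient is 0 with high probability,
# +/- mu otherwise; mu is chosen so both draws have unit expected energy.
eps = 0.05
mu = 1.0 / np.sqrt(eps)
sparse = np.where(rng.random(n) < eps,
                  mu * rng.choice([-1.0, 1.0], size=n),
                  0.0)

def kurtosis(x):
    """Heavy-tailedness measure: E x^4 / (E x^2)^2, about 3 for Gaussian."""
    return np.mean(x ** 4) / np.mean(x ** 2) ** 2
```

Feeding the sparse draw through an inverse wavelet transform produces a path made of a few isolated spikes, whereas the Gaussian draw produces the uniformly rough paths familiar from the ellipsoidal setting.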

Smooth Wavelet Decompositions with Blocky Coefficient Kernels
David L. Donoho
May, 1993. To Appear: ``Recent Advances in Wavelet Analysis'',
Schumaker and Webb, Eds. Academic Press.
blocky.tex (Plain Tex)
7 figures -- blockyps.shar

We describe bases of smooth
wavelets where the coefficients are obtained
by integration against (finite combinations of) boxcar kernels rather
than against traditional smooth wavelets. Bases of this type
were first developed in work of Tchamitchian
and of Cohen, Daubechies, and Feauveau.
Our approach emphasizes the idea of
{\it average-interpolation} -- synthesizing
a smooth function on the line having prescribed boxcar averages --
and the link between average-interpolation and Dubuc-Deslauriers
interpolation. We also emphasize characterizations of smooth
functions via their coefficients.
We describe boundary-corrected expansions for the interval, which
have a simple and revealing form.
We use these results to re-interpret the {\it empirical
wavelet transform} -- i.e. finite, discrete wavelet transforms
of data arising from boxcar integrators (e.g. CCD devices).
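The average-interpolation idea can be made concrete with its simplest nontrivial case: a degree-2 refinement that predicts two half-cell averages from three neighboring cell averages. This minimal sketch handles interior cells only; the boundary-corrected version the abstract mentions is omitted.

```python
import numpy as np

def refine(a):
    """One step of degree-2 average-interpolating refinement.

    Given cell averages a[0..n-1], returns predicted half-cell averages
    for the interior cells k = 1..n-2:
        a_fine[2k]   = a[k] + (a[k-1] - a[k+1]) / 8
        a_fine[2k+1] = a[k] - (a[k-1] - a[k+1]) / 8
    Each pair averages back to a[k], so the boxcar averages are preserved,
    and the prediction is exact whenever the averages come from a quadratic.
    """
    d = (a[:-2] - a[2:]) / 8.0
    fine = np.empty(2 * (len(a) - 2))
    fine[0::2] = a[1:-1] + d
    fine[1::2] = a[1:-1] - d
    return fine
```

Iterating this refinement on boxcar data, and adding thresholded detail corrections, is in outline how one passes from blocky coefficient kernels to a smooth reconstruction.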

Wavelet Shrinkage and W.V.D. -- A ten-minute tour
David L. Donoho
January, 1993. To Appear, Proc. Toulouse Conference on Wavelets and Appl.,
S. Roques Ed., Springer-Verlag, 1993.
toulouse.tex (plain tex)
15 figures -- toulouseps.shar

A brief tour of applications of thresholding of wavelet coefficients
for de-noising, and for solving inverse problems such as deconvolution
and numerical differentiation.