Deconvolution

April 20, 2015 — April 11, 2016

convolution
density
functional analysis
linear algebra
nonparametric
probability
signal processing
sparser than thou
statistics

I wish, for a project of my own, to know how to deconvolve with

  1. high-dimensional data
  2. irregularly sampled data
  3. inhomogeneous (although known) convolution kernels

This is in a signal processing setting; for the (closely-related) kernel-density estimation in a statistical setting, see kernel approximation. If you don’t know your noise spectrum, see blind deconvolution.

1 Vanilla deconvolution

Wiener filtering! Deconvolving a signal convolved with a known kernel: say, reconstructing the pure sound of an instrument, or the sound of the echo in a church, from a recording made in a reverberant church. It’s not purely acoustic, though; it applies to images, abstract wacky function spaces, etc. The procedure is to presume your signal has been blurred by (or generally convolved with) some filter, and then to find a new filter that undoes the effects of the previous one, or comes as close as possible to that, since not all filters are invertible. In the basic case this is approximately the same thing as filter inversion, although there are some fiddly cases when the kernel is noninvertible or the inverse transform is unstable.
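A minimal sketch of frequency-domain Wiener deconvolution, assuming a known kernel and circular convolution. The constant noise-to-signal ratio `nsr`, the exponential “reverb” kernel, and the toy spike train are illustrative choices of mine, not anything canonical; a proper Wiener filter would use frequency-dependent noise and signal spectra.

```python
import numpy as np

def wiener_deconvolve(y, h, nsr=1e-2):
    """Estimate x from y = h (*) x + noise, with (*) denoting circular convolution."""
    n = len(y)
    H = np.fft.rfft(h, n)
    Y = np.fft.rfft(y, n)
    # Damp frequencies where the kernel is weak rather than dividing by ~0,
    # which is what the naive inverse filter 1/H would do.
    G = np.conj(H) / (np.abs(H) ** 2 + nsr)
    return np.fft.irfft(G * Y, n)

# Toy usage: blur a spike train with an exponential "reverb" kernel, add noise, recover.
rng = np.random.default_rng(0)
x = np.zeros(256)
x[[30, 90, 200]] = 1.0
h = np.exp(-np.arange(256) / 5.0)
y = np.fft.irfft(np.fft.rfft(x) * np.fft.rfft(h), 256) + 0.01 * rng.standard_normal(256)
x_hat = wiener_deconvolve(y, h, nsr=1e-2)
```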

Linear versions are (apparently) straightforward Wiener filters (more-or-less generalized Kalman filters, although I think they are historically prior; 🏗 make this precise). Clearly you get deconvolution-like behaviour in state filters sometimes too. I should inspect the edges of these definitions to work out the precise intersection. What about the Markovian case?

Non-linear “deconvolutions” are AFAIK not strictly speaking deconvolutions, since convolution is a linear operation; anyway, that usage seems to be established in the literature. Cf. the iterative Richardson–Lucy algorithm, sketched below.
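A hedged sketch of the Richardson–Lucy multiplicative update for a nonnegative 1-D signal and a known nonnegative kernel. The flat initial guess and the `mode="same"` boundary handling are simplifications of mine.

```python
import numpy as np

def richardson_lucy(y, h, n_iter=50, eps=1e-12):
    """Multiplicative Richardson-Lucy updates for nonnegative y and kernel h (1-D)."""
    h = h / h.sum()
    h_adj = h[::-1]                        # flipped kernel acts as the adjoint
    x = np.full_like(y, y.mean())          # flat, strictly positive initial guess
    for _ in range(n_iter):
        blurred = np.convolve(x, h, mode="same")
        ratio = y / (blurred + eps)        # eps guards against division by zero
        x = x * np.convolve(ratio, h_adj, mode="same")
    return x

# Toy usage: a spike train blurred by a short symmetric kernel.
x_true = np.zeros(200)
x_true[[40, 100, 150]] = 1.0
h = np.array([0.1, 0.2, 0.4, 0.2, 0.1])
y = np.convolve(x_true, h, mode="same") + 1e-3   # small offset keeps y nonnegative
x_hat = richardson_lucy(y, h, n_iter=200)
```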

See also compressive sensing, convolution kernels.

2 Deconvolution method in statistics

Curses! I thought this weird, obvious idea was a new idea of mine. Turns out it’s old. “Density deconvolution” is a keyword here, and it’s reasonably common in hierarchical models.
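A rough sketch of the classical deconvolving-kernel density estimator in that spirit: observe W = X + U with a known Gaussian error U, and divide the noise characteristic function out of the empirical characteristic function of W under a band-limiting kernel before Fourier inversion. The specific band-limiting weight, bandwidth, and toy data below are my own choices, not from the post.

```python
import numpy as np

def deconv_kde(w, sigma_u, x_grid, h):
    """Density of X from samples w = x + N(0, sigma_u^2) noise; h is a bandwidth."""
    t = np.linspace(-1.0 / h, 1.0 / h, 512)                  # band-limit to |t| <= 1/h
    phi_hat_w = np.exp(1j * np.outer(t, w)).mean(axis=1)     # empirical char. fn of W
    phi_u = np.exp(-0.5 * (sigma_u * t) ** 2)                # known char. fn of the noise
    phi_k = np.clip(1.0 - (h * t) ** 2, 0.0, None) ** 3      # smooth band-limiting weight
    integrand = phi_k * phi_hat_w / phi_u
    # Fourier inversion on the grid; crude Riemann sum, keep the real part.
    dt = t[1] - t[0]
    f = (np.exp(-1j * np.outer(x_grid, t)) * integrand).sum(axis=1).real * dt / (2 * np.pi)
    return np.clip(f, 0.0, None)

# Toy usage: recover a bimodal density from draws contaminated with known Gaussian noise.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2, 0.5, 500), rng.normal(2, 0.5, 500)])
w = x + rng.normal(0, 0.75, size=x.shape)
grid = np.linspace(-5, 5, 200)
f_hat = deconv_kde(w, sigma_u=0.75, x_grid=grid, h=0.4)
```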
