# Uncertainty principles

## Signal processing/spectral uncertainties

TBD; see Wikipedia for now.

## Entropic uncertainty

In 1957, Hirschman considered a function $$f$$ and its Fourier transform $$g$$, such that

$$g(y)\approx \int_{-\infty}^{\infty}\exp(-2\pi ixy)f(x)\,dx,\qquad f(x)\approx \int_{-\infty}^{\infty}\exp(2\pi ixy)g(y)\,dy~,$$

where the “≈” indicates convergence in $$L_2$$, with $$f$$ and $$g$$ normalized so that (by Plancherel’s theorem)

$$\int_{-\infty}^{\infty}|f(x)|^{2}\,dx=\int_{-\infty}^{\infty}|g(y)|^{2}\,dy=1~.$$
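As a numerical sanity check (not part of the source), the Fourier pair and the Plancherel normalization above can be verified for a Gaussian; the specific choice $$f(x)=2^{1/4}e^{-\pi x^{2}}$$ (which is its own Fourier transform under this convention), the grid, and the FFT shifting recipe are all assumptions of this sketch:

```python
import numpy as np

# Assumed test function: f(x) = 2**0.25 * exp(-pi*x**2), its own Fourier
# transform under the exp(-2*pi*i*x*y) convention, with integral of |f|^2 = 1.
N, L = 4096, 20.0
dx = L / N
x = -L / 2 + dx * np.arange(N)        # symmetric grid, x[N//2] == 0
f = 2**0.25 * np.exp(-np.pi * x**2)

# Approximate g(y) = ∫ exp(-2*pi*i*x*y) f(x) dx with a shifted, scaled FFT.
y = np.fft.fftshift(np.fft.fftfreq(N, d=dx))
g = dx * np.fft.fftshift(np.fft.fft(np.fft.ifftshift(f)))
dy = y[1] - y[0]

# Plancherel: both integrals should be 1, and g should match f's functional form.
print(np.sum(np.abs(f)**2) * dx, np.sum(np.abs(g)**2) * dy)  # both ≈ 1
```

The shift pair `ifftshift`/`fftshift` centers the grid at zero so the discrete transform approximates the continuous one without spurious phase factors.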

Hirschman showed that for any such functions the sum of the Shannon entropies is non-negative,

$$H(|f|^{2})+H(|g|^{2})\equiv -\int_{-\infty}^{\infty}|f(x)|^{2}\log |f(x)|^{2}\,dx-\int_{-\infty}^{\infty}|g(y)|^{2}\log |g(y)|^{2}\,dy\geq 0.$$

The tighter bound

$$H(|f|^{2})+H(|g|^{2})\geq \log \frac{e}{2}~,$$

was conjectured by Hirschman and Everett, proven in 1975 by Beckner, and in the same year interpreted as a generalized quantum-mechanical uncertainty principle by Białynicki-Birula and Mycielski. Equality holds for Gaussian distributions.
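The Gaussian equality case can be checked numerically. This sketch (an assumption of this note, not from the source) uses $$f(x)=2^{1/4}e^{-\pi x^{2}}$$, which is its own Fourier transform under the convention above, so the two entropies coincide and their sum should attain the bound $$\log(e/2)\approx 0.3069$$:

```python
import numpy as np

# |f(x)|^2 = |g(x)|^2 = sqrt(2) * exp(-2*pi*x**2) for the self-dual Gaussian
# f(x) = 2**0.25 * exp(-pi*x**2) (an assumed example, normalized to 1 in L2).
x = np.linspace(-10.0, 10.0, 200_001)
dx = x[1] - x[0]
p = np.sqrt(2.0) * np.exp(-2.0 * np.pi * x**2)

# Riemann-sum differential entropy H(p) = -∫ p log p dx.
H = -np.sum(p * np.log(p)) * dx

entropy_sum = 2.0 * H                  # H(|f|^2) + H(|g|^2), since g = f
print(entropy_sum, np.log(np.e / 2.0))  # both ≈ 0.30685
```

The sum matches $$\log(e/2)=1-\log 2$$ to grid accuracy, consistent with Gaussians saturating the Beckner bound.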