## Signal processing/spectral uncertainties

TBD; see Wikipedia for now.

## Entropic uncertainty

In 1957, Hirschman considered a function \(f\) and its Fourier transform \(g\) such that

\begin{equation*} g(y)\approx \int_{-\infty}^{\infty}\exp(-2\pi ixy)\,f(x)\,dx,\qquad f(x)\approx \int_{-\infty}^{\infty}\exp(2\pi ixy)\,g(y)\,dy~, \end{equation*}

where the “≈” indicates convergence in \(L_2\), with both functions normalized so that (by Plancherel’s theorem)

\begin{equation*} \int_{-\infty}^{\infty}|f(x)|^{2}\,dx=\int_{-\infty}^{\infty}|g(y)|^{2}\,dy=1~. \end{equation*}

He showed that for any such functions the sum of the Shannon entropies is non-negative,

\begin{equation*} H(|f|^{2})+H(|g|^{2})\equiv -\int_{-\infty}^{\infty}|f(x)|^{2}\log |f(x)|^{2}\,dx-\int_{-\infty}^{\infty}|g(y)|^{2}\log |g(y)|^{2}\,dy\geq 0~. \end{equation*}

A tighter bound,

\begin{equation*} H(|f|^{2})+H(|g|^{2})\geq \log\frac{e}{2}~, \end{equation*}

was conjectured by Hirschman and Everett, proven in 1975 by Beckner, and in the same year interpreted as a generalized quantum-mechanical uncertainty principle by Białynicki-Birula and Mycielski. Equality holds for Gaussian distributions.
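The bound above can be checked numerically. The sketch below picks a Gaussian \(f\), for which the Fourier transform (with the \(\exp(-2\pi ixy)\) convention used here) is again a Gaussian in closed form, verifies the Plancherel normalization, and confirms that the entropy sum equals \(\log\frac{e}{2}\), i.e. that Gaussians saturate the bound. The width parameter `a` and the integration grid are arbitrary choices for this check, not part of the text.

```python
import numpy as np

# Arbitrary Gaussian width and integration grid, chosen for this sketch only.
a = 0.7
x = np.linspace(-20.0, 20.0, 400001)
dx = x[1] - x[0]

# f(x) = (2a)^{1/4} exp(-pi a x^2) has unit L2 norm; with the
# exp(-2*pi*i*x*y) transform convention, its Fourier transform is the same
# Gaussian with a -> 1/a, so |g|^2 is available in closed form.
f2 = np.sqrt(2 * a) * np.exp(-2 * np.pi * a * x**2)   # |f(x)|^2
g2 = np.sqrt(2 / a) * np.exp(-2 * np.pi * x**2 / a)   # |g(y)|^2

# Plancherel normalization: both integrals should be 1.
norm_f = f2.sum() * dx
norm_g = g2.sum() * dx

def entropy(p):
    """Shannon differential entropy -∫ p log p dx (natural log), Riemann sum."""
    safe = np.where(p > 0, p, 1.0)   # avoid log(0) where p underflows to 0
    return (-(p * np.log(safe))).sum() * dx

total = entropy(f2) + entropy(g2)
print(norm_f, norm_g)            # both ≈ 1
print(total, np.log(np.e / 2))   # the two agree: Gaussians saturate the bound
```

Note that the bound \(\log\frac{e}{2}\) is tied to natural logarithms and the \(\exp(\pm 2\pi ixy)\) Fourier convention; other conventions shift the constant.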