# Correlograms

### Also covariances

Usefulness: đ§ đ§
Novelty: đĄ
Uncertainty: đ€Ș đ€Ș đ€Ș
Incompleteness: đ§ đ§ đ§

This material is revised and expanded from the appendix of draft versions of a recent conference submission, kept here for my own reference. I used correlograms heavily in that work, and it was startling to find that, despite being simple and (to my mind) uncontroversial, a decent summary of their properties is hard to find anywhere. Nothing new here, but see also Wiener–Khintchine and covariance kernels for related material.

Credit to Ning Ma.

$\renewcommand{\var}{\operatorname{Var}} \renewcommand{\dd}{\mathrm{d}} \renewcommand{\pd}{\partial} \renewcommand{\bb}[1]{\mathbb{#1}} \renewcommand{\vv}[1]{\boldsymbol{#1}} \renewcommand{\mm}[1]{\boldsymbol{#1}} \renewcommand{\mmm}[1]{\mathrm{#1}} \renewcommand{\cc}[1]{\mathcal{#1}} \renewcommand{\ff}[1]{\mathfrak{#1}} \renewcommand{\oo}[1]{\operatorname{#1}} \renewcommand{\gvn}{\mid} \renewcommand{\II}[1]{\mathbb{I}\{#1\}} \renewcommand{\inner}[2]{\langle #1,#2\rangle} \renewcommand{\Inner}[2]{\left\langle #1,#2\right\rangle} \renewcommand{\finner}[3]{\langle #1,#2;#3\rangle} \renewcommand{\FInner}[3]{\left\langle #1,#2;#3\right\rangle} \renewcommand{\dinner}[2]{[ #1,#2]} \renewcommand{\DInner}[2]{\left[ #1,#2\right]} \renewcommand{\norm}[1]{\| #1\|} \renewcommand{\Norm}[1]{\left\| #1\right\|} \renewcommand{\fnorm}[2]{\| #1;#2\|} \renewcommand{\FNorm}[2]{\left\| #1;#2\right\|} \renewcommand{\argmax}{\mathop{\mathrm{argmax}}} \renewcommand{\argmin}{\mathop{\mathrm{argmin}}} \renewcommand{\omp}{\mathop{\mathrm{OMP}}}$

Consider an $$L_2$$ signal $$f: \bb{R}\to\bb{R}.$$ We will frequently overload notation and refer to a signal by its value at a free argument $$t$$, so that $$f(rt-\xi),$$ for example, refers to the signal $$t\mapsto f(rt-\xi).$$ We write the inner product between signals $$t\mapsto f(t)$$ and $$t\mapsto f'(t)$$ as $$\inner{f(t)}{f'(t)}$$. Where it is not clear that the free argument is, say, $$t$$, we annotate it: $$\finner{f(t)}{f'(t)}{t}$$.

The correlogram $$\cc{A}:L_2(\bb{R}) \to L_2(\bb{R})$$ maps signals to signals. Specifically, $$\mathcal{A}\{f\}$$ is a signal $$\bb{R}\to\bb{R}$$ such that

$\mathcal{A}\{f\}:=\xi \mapsto \finner{ f(t) }{ f(t-\xi) }{t}$

This is the covariance between $$f(t)$$ and $$f(t-\xi).$$ (Note that we discuss here the covariance between given deterministic signals, not between two stochastic sources; the covariance of stochastic processes is a broader topic, let alone the inference of that covariance from realizations.) Note also that this is what I would call an autocovariance rather than an autocorrelation, since it is not normalized, but I will stick with the latter term for reasons of convention.
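To make the definition concrete, here is a minimal discrete sketch (my own illustration, not from the submission). The function name `correlogram` is hypothetical; `np.correlate` plays the role of the continuous inner product, and the grid spacing `dt` approximates the integral.

```python
import numpy as np

def correlogram(f, dt=1.0):
    """Discrete analogue of A{f}(xi) = <f(t), f(t - xi)>_t.

    Returns the lags and the (unnormalized) autocovariance over
    xi = -(n-1)*dt, ..., 0, ..., (n-1)*dt.
    """
    n = len(f)
    # 'full' mode gives sum_t f[t] f[t - k] for every integer lag k;
    # the factor dt approximates the continuous integral.
    acov = np.correlate(f, f, mode="full") * dt
    lags = np.arange(-(n - 1), n) * dt
    return lags, acov

# Sanity check: at lag 0 the correlogram equals the squared L2 norm.
t = np.linspace(0.0, 1.0, 512, endpoint=False)
dt = t[1] - t[0]
f = np.sin(2 * np.pi * 5 * t)
lags, acov = correlogram(f, dt)
assert np.isclose(acov[len(f) - 1], np.sum(f**2) * dt)
```

The lag-0 entry sits at index `n - 1` of the `full`-mode output, which is why the check indexes there rather than searching the float-valued `lags`.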

We derive some properties of this transform.

Multiplication by a constant: consider a constant $$c\in \bb{R}.$$

\begin{aligned}\mathcal{A}\{cf\}(\xi)&= \finner{ cf(t) }{ cf(t-\xi) }{t}\\ &= c^2\finner{ f(t) }{ f(t-\xi) }{t}\\ &= c^2\mathcal{A}\{f\}(\xi).\\ \end{aligned}
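A quick numerical confirmation of the $$c^2$$ scaling (an illustrative sketch of my own; the discrete `np.correlate` stands in for the continuous inner product):

```python
import numpy as np

# Check A{c f}(xi) = c^2 A{f}(xi) at every lag of a sampled signal.
rng = np.random.default_rng(0)
f = rng.standard_normal(256)
c = 3.7

acov = np.correlate(f, f, mode="full")
acov_scaled = np.correlate(c * f, c * f, mode="full")
assert np.allclose(acov_scaled, c**2 * acov)
```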

Time scaling: take $$r>0$$ (for general $$r\neq 0$$ the prefactor becomes $$1/|r|$$). The lag of the dilated signal $$t\mapsto f(rt)$$ is taken in its own time variable, i.e. the lagged signal is $$t\mapsto f(r(t-\xi))$$. Substituting $$u=rt$$,

\begin{aligned}\mathcal{A}\{f(r t)\}(\xi) &=\finner{ f(r t) }{ f(r(t-\xi)) }{t}\\ &= \int f(r t)f(r t-r\xi)\dd t\\ &= \frac{1}{r }\int f(u)f(u-r\xi)\dd u\\ &= \frac{1}{r} \mathcal{A}\{f\}\left(r\xi\right)\\ \end{aligned}
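The time-scaling identity $$\mathcal{A}\{f(rt)\}(\xi) = \tfrac{1}{r}\mathcal{A}\{f\}(r\xi)$$ (for $$r>0$$, lagging the dilated signal in its own time variable) can be checked numerically. The Gaussian test signal $$f(t)=e^{-t^2}$$ and the Riemann-sum quadrature below are my own illustrative choices, not from the original text.

```python
import numpy as np

def acov(g, xi, t, dt):
    # Riemann-sum approximation of A{g}(xi) = integral g(t) g(t - xi) dt.
    return np.sum(g(t) * g(t - xi)) * dt

def f(s):
    return np.exp(-s**2)

t = np.linspace(-30.0, 30.0, 60001)
dt = t[1] - t[0]
r, xi = 1.8, 0.7

lhs = acov(lambda s: f(r * s), xi, t, dt)  # correlogram of the dilated signal
rhs = acov(f, r * xi, t, dt) / r           # (1/r) A{f}(r xi)
assert np.isclose(lhs, rhs)
```

For this Gaussian the correlogram has the closed form $$\mathcal{A}\{f\}(\xi)=\sqrt{\pi/2}\,e^{-\xi^2/2}$$, so both sides can also be compared against $$\tfrac{1}{r}\sqrt{\pi/2}\,e^{-(r\xi)^2/2}$$.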

Addition:

\begin{aligned}\mathcal{A}\{f+f'\}(\xi) &=\finner{ f(t)+f'(t) }{ f(t-\xi)+f'(t-\xi) }{t}\\ &=\finner{ f(t) }{ f(t-\xi) }{t}+\finner{ f(t) }{ f'(t-\xi) }{t} +\finner{ f'(t) }{ f(t-\xi) }{t}+\finner{ f'(t) }{ f'(t-\xi) }{t}\\ &= \mathcal{A}\{f\}(\xi)+ \finner{ f'(t) }{ f(t-\xi)}{t} +\finner{f(t)}{f'(t-\xi) }{t} +\mathcal{A}\{f'\}(\xi)\\ &= \mathcal{A}\{f\}(\xi)+ \finner{ f'(t) }{ f(t-\xi)}{t} +\finner{f(t+\xi)}{f'(t) }{t} +\mathcal{A}\{f'\}(\xi)\\ &= \mathcal{A}\{f\}(\xi)+ \finner{ f'(t) }{ f(t-\xi)}{t} +\finner{f'(t) }{f(t+\xi)}{t} +\mathcal{A}\{f'\}(\xi).\\ \end{aligned}

We can say little about the cross term $$\finner{ f'(t) }{ f(t-\xi)}{t}+\finner{f'(t) }{f(t+\xi)}{t}$$ without more information about the signals in question. However, we can solve a randomised version. Suppose $$S_i, \, i \in\bb{N}$$ are i.i.d. Rademacher variables, i.e. each takes the values $$+1$$ and $$-1$$ with probability $$\tfrac12$$ each. Then we have the following property:

Randomised addition:

\begin{aligned} \bb{E}[ \mathcal{A}\{S_1f + S_2f'\}(\xi)] &=\bb{E}[ \mathcal{A}\{S_1f\}(\xi) + \finner{ S_2 f'(t) }{ S_1 f(t-\xi)}{t} +\finner{S_2f'(t) }{S_1 f(t+\xi)}{t} +\mathcal{A}\{S_2f'\}(\xi)]\\ &=\bb{E}[ \mathcal{A}\{S_1f\}(\xi)] + \bb{E}\finner{ S_2 f'(t) }{ S_1 f(t-\xi)}{t} + \bb{E}\finner{S_2f'(t) }{S_1 f(t+\xi)}{t} +\bb{E}[ \mathcal{A}\{S_2f'\}(\xi)]\\ &=\mathcal{A}\{f\}(\xi)+ \bb{E}[ S_1S_2]\finner{ f'(t) }{ f(t-\xi) }{t} + \bb{E}[ S_1S_2]\finner{ f'(t) }{ f(t+\xi) }{t}+\mathcal{A}\{f'\}(\xi)\\ &=\mathcal{A}\{f\}(\xi)+ \mathcal{A}\{f'\}(\xi),\\ \end{aligned}

using $$S_1^2=S_2^2=1$$ and $$\bb{E}[S_1S_2]=\bb{E}[S_1]\,\bb{E}[S_2]=0$$ by independence.
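The randomised-addition property is easy to verify numerically. Since $$(S_1,S_2)$$ takes only four equiprobable values, the expectation is an exact average over the four sign combinations; no Monte Carlo sampling is needed. This is a sketch of my own with discrete signals, using `np.correlate` as a stand-in for the continuous inner product.

```python
import numpy as np

rng = np.random.default_rng(1)
f = rng.standard_normal(128)
fp = rng.standard_normal(128)  # plays the role of f'

def acov(x):
    # Discrete correlogram over all lags.
    return np.correlate(x, x, mode="full")

# E[ A{S1 f + S2 f'} ] as an exact average over the four equiprobable
# sign combinations (S1, S2) in {+1, -1}^2.
expectation = sum(
    acov(s1 * f + s2 * fp)
    for s1 in (-1.0, 1.0)
    for s2 in (-1.0, 1.0)
) / 4.0
assert np.allclose(expectation, acov(f) + acov(fp))
```

Note that for any single sign combination the cross term generally does not vanish; it is only the average over signs that recovers $$\mathcal{A}\{f\}+\mathcal{A}\{f'\}$$.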

# Refs

Abrahamsen, Petter. 1997. “A Review of Gaussian Random Fields and Correlation Functions.” http://publications.nr.no/publications.nr.no/directdownload/publications.nr.no/rask/old/917_Rapport.pdf.

Bochner, Salomon. 1959. Lectures on Fourier Integrals. Princeton University Press. http://books.google.com?id=MWCYDwAAQBAJ.

Brown, Judith C., and Miller S. Puckette. 1989. “Calculation of a ‘Narrowed’ Autocorrelation Function.” The Journal of the Acoustical Society of America 85 (4): 1595–1601. https://doi.org/10.1121/1.397363.

Cariani, P. A., and B. Delgutte. 1996. “Neural Correlates of the Pitch of Complex Tones. I. Pitch and Pitch Salience.” Journal of Neurophysiology 76 (3): 1698–1716. https://doi.org/10.1152/jn.1996.76.3.1698.

Cheveigné, Alain de, and Hideki Kawahara. 2002. “YIN, a Fundamental Frequency Estimator for Speech and Music.” The Journal of the Acoustical Society of America 111 (4): 1917–30. https://doi.org/10.1121/1.1458024.

Kaso, Artan. 2018. “Computation of the Normalized Cross-Correlation by Fast Fourier Transform.” PLOS ONE 13 (9): e0203434. https://doi.org/10.1371/journal.pone.0203434.

Khintchine, A. 1934. “Korrelationstheorie der stationären stochastischen Prozesse.” Mathematische Annalen 109 (1): 604–15. https://doi.org/10.1007/BF01449156.

Langner, Gerald. 1992. “Periodicity Coding in the Auditory System.” Hearing Research 60 (2): 115–42. https://doi.org/10.1016/0378-5955(92)90015-F.

Lewis, J. P. n.d. “Fast Template Matching.” Accessed June 4, 2019. http://www.scribblethink.org/Work/nvisionInterface/vi95_lewis.pdf.

Licklider, J. C. R. 1951. “A Duplex Theory of Pitch Perception.” Experientia 7 (4): 128–34. https://doi.org/10.1007/BF02156143.

Ma, Ning, Phil Green, Jon Barker, and André Coy. 2007. “Exploiting Correlogram Structure for Robust Speech Recognition with Multiple Speech Sources.” Speech Communication 49 (12): 874–91. https://doi.org/10.1016/j.specom.2007.05.003.

Morales-Cordovilla, J. A., A. M. Peinado, V. Sanchez, and J. A. Gonzalez. 2011. “Feature Extraction Based on Pitch-Synchronous Averaging for Robust Speech Recognition.” IEEE Transactions on Audio, Speech, and Language Processing 19 (3): 640–51. https://doi.org/10.1109/TASL.2010.2053846.

Rabiner, L. 1977. “On the Use of Autocorrelation Analysis for Pitch Detection.” IEEE Transactions on Acoustics, Speech, and Signal Processing 25 (1): 24–33. https://doi.org/10.1109/TASSP.1977.1162905.

Slaney, M., and R. F. Lyon. 1990. “A Perceptual Pitch Detector.” In Proceedings of ICASSP, 357–60 vol.1. https://doi.org/10.1109/ICASSP.1990.115684.

Sondhi, M. 1968. “New Methods of Pitch Extraction.” IEEE Transactions on Audio and Electroacoustics 16 (2): 262–66. https://doi.org/10.1109/TAU.1968.1161986.

Tan, L. N., and A. Alwan. 2011. “Noise-Robust F0 Estimation Using SNR-Weighted Summary Correlograms from Multi-Band Comb Filters.” In 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 4464–7. https://doi.org/10.1109/ICASSP.2011.5947345.

Wiener, Norbert. 1930. “Generalized Harmonic Analysis.” Acta Mathematica 55: 117–258. https://doi.org/10.1007/BF02546511.