# Restricted isometry properties

### Plus incoherence, irrepresentability, and other uncertainty bounds for a sparse world

Restricted isometry properties, a.k.a. uniform uncertainty principles (CaTa05, CaRT06), mutual incoherence (DoET06), irrepresentability conditions (ZhYu06)…

These are recoverability conditions, as seen in sparse regression, sparse dictionary learning, function approximation, compressed sensing and so on: uncertainty principles for a sparse world.

Terry Tao surveys the various related conditions for the compressed sensing problem, and which types of random matrices satisfy them. His casual lecture notes on the uniform uncertainty principle look nice.

## Restricted Isometry

The compressed sensing formulation.

The restricted isometry constant of order $s$ of a matrix $A$ is the smallest constant $\delta_s(A)$ such that

$$(1-\delta_s(A))\|x\|_2^2\leq \|Ax\|_2^2\leq (1+\delta_s(A))\|x\|_2^2$$

for all $s$-sparse $x$. That is, the measurement matrix does not change the norm of sparse signals “too much”, and in particular does not annihilate them when $\delta_s \lt 1.$
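The definition can be checked directly on small matrices by enumerating every size-$s$ support and reading $\delta_s$ off the extreme singular values of each column submatrix. A minimal sketch of this brute-force computation (function name and dimensions are my own illustration; computing $\delta_s$ this way is exponential in $s$, and the general problem is NP-hard, so this is only for building intuition):

```python
import itertools

import numpy as np


def restricted_isometry_constant(A, s):
    """Compute delta_s exactly by enumerating all size-s column supports.

    For each support S, ||A_S x||^2 / ||x||^2 ranges over
    [sigma_min(A_S)^2, sigma_max(A_S)^2], so delta_s is the worst
    deviation of those squared singular values from 1.
    """
    n = A.shape[1]
    delta = 0.0
    for support in itertools.combinations(range(n), s):
        sv = np.linalg.svd(A[:, support], compute_uv=False)
        # sv is sorted in decreasing order: sv[0] = sigma_max, sv[-1] = sigma_min
        delta = max(delta, sv[0] ** 2 - 1.0, 1.0 - sv[-1] ** 2)
    return delta


# A Gaussian measurement matrix with columns of roughly unit norm
rng = np.random.default_rng(0)
m, n, s = 20, 30, 2
A = rng.normal(size=(m, n)) / np.sqrt(m)
delta_2 = restricted_isometry_constant(A, s)
```

As a sanity check, a matrix with orthonormal columns has $\delta_s = 0$ for every $s$ up to its number of columns, since every submatrix is an exact isometry.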

TBD.

## Refs

BDDW08
Baraniuk, R., Davenport, M., DeVore, R., & Wakin, M. (2008) A Simple Proof of the Restricted Isometry Property for Random Matrices. Constructive Approximation, 28(3), 253–263. DOI.
BCDH10
Baraniuk, R. G., Cevher, V., Duarte, M. F., & Hegde, C. (2010) Model-based compressive sensing. IEEE Transactions on Information Theory, 56(4), 1982–2001. DOI.
CaXZ08
Cai, T. T., Xu, G., & Zhang, J. (2008) On Recovery of Sparse Signals via $\ell_1$ Minimization. ArXiv:0805.0149 [Cs].
CENR11
Candès, E. J., Eldar, Y. C., Needell, D., & Randall, P. (2011) Compressed sensing with coherent and redundant dictionaries. Applied and Computational Harmonic Analysis, 31(1), 59–73. DOI.
CaRT06
Candès, E. J., Romberg, J. K., & Tao, T. (2006) Stable signal recovery from incomplete and inaccurate measurements. Communications on Pure and Applied Mathematics, 59(8), 1207–1223. DOI.
CaTa00
Candès, E. J., & Tao, T. (n.d.) The uniform uncertainty principle and compressed sensing.
CaTa05
Candès, E., & Tao, T. (2005) Decoding by linear programming. IEEE Transactions on Information Theory, 51(12), 4203–4215. DOI.
DDFG10
Daubechies, I., DeVore, R., Fornasier, M., & Güntürk, C. S. (2010) Iteratively reweighted least squares minimization for sparse recovery. Communications on Pure and Applied Mathematics, 63(1), 1–38. DOI.
DoET06
Donoho, D. L., Elad, M., & Temlyakov, V. N. (2006) Stable recovery of sparse overcomplete representations in the presence of noise. IEEE Transactions on Information Theory, 52(1), 6–18. DOI.
FGLE12
Flammia, S. T., Gross, D., Liu, Y.-K., & Eisert, J. (2012) Quantum Tomography via Compressed Sensing: Error Bounds, Sample Complexity, and Efficient Estimators. New Journal of Physics, 14(9), 095022. DOI.
FoSr11
Foygel, R., & Srebro, N. (2011) Fast-rate and optimistic-rate error bounds for L1-regularized regression. ArXiv:1108.0373 [Math, Stat].
HeBa12
Hegde, C., & Baraniuk, R. G. (2012) Signal Recovery on Incoherent Manifolds. IEEE Transactions on Information Theory, 58(12), 7204–7214. DOI.
MeYu09
Meinshausen, N., & Yu, B. (2009) Lasso-type recovery of sparse representations for high-dimensional data. The Annals of Statistics, 37(1), 246–270. DOI.
NeVe09
Needell, D., & Vershynin, R. (2009) Uniform Uncertainty Principle and Signal Recovery via Regularized Orthogonal Matching Pursuit. Foundations of Computational Mathematics, 9(3), 317–334. DOI.
RWRY11
Ravikumar, P., Wainwright, M. J., Raskutti, G., & Yu, B. (2011) High-dimensional covariance estimation by minimizing ℓ1-penalized log-determinant divergence. Electronic Journal of Statistics, 5, 935–980. DOI.
RiGr14
Rish, I., & Grabarnik, G. (2014) Sparse Signal Recovery with Exponential-Family Noise. In A. Y. Carmi, L. Mihaylova, & S. J. Godsill (Eds.), Compressed Sensing & Sparse Filtering (pp. 77–93). Springer Berlin Heidelberg. DOI.
SoMe17
Soni, A., & Mehdad, Y. (2017) RIPML: A Restricted Isometry Property based Approach to Multilabel Learning. ArXiv:1702.05181 [Cs, Stat].
Geer14
van de Geer, S. (2014) Worst possible sub-directions in high-dimensional models. In arXiv:1403.7023 [math, stat] (Vol. 131).
WuHC13
Wu, R., Huang, W., & Chen, D. R. (2013) The Exact Support Recovery of Sparse Signals With Noise via Orthogonal Matching Pursuit. IEEE Signal Processing Letters, 20(4), 403–406. DOI.
Zhan10
Zhang, C.-H. (2010) Nearly unbiased variable selection under minimax concave penalty. The Annals of Statistics, 38(2), 894–942. DOI.
ZhYu06
Zhao, P., & Yu, B. (2006) On model selection consistency of Lasso. Journal of Machine Learning Research, 7(Nov), 2541–2563.