The Living Thing / Notebooks :

Learning on manifolds

Finding the lowest bit of a krazy straw, from the inside

Usefulness: 🔧
Novelty: 💡
Uncertainty: 🤪 🤪 🤪
Incompleteness: 🚧 🚧 🚧

A placeholder for learning on curved spaces. Not discussed: learning OF curved spaces.

Also: learning where the data live on an a priori known manifold seems to be another sense of the term. See the work of, e.g., Nina Miolane and collaborators on the Geomstats project.
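Geomstats packages this up properly; as a toy illustration of the basic move (hand-rolled numpy, none of this is Geomstats API), here is Riemannian gradient descent on the unit sphere: project the Euclidean gradient onto the tangent space, step, then retract back onto the sphere by renormalising.

```python
import numpy as np

def riemannian_gd_sphere(f_grad, x0, lr=0.02, steps=5000):
    """Minimise a smooth function restricted to the unit sphere S^{n-1}.

    f_grad: Euclidean gradient of the objective.
    Each step: remove the component of the gradient normal to the sphere
    (projection onto the tangent space at x), step in the tangent
    direction, then retract onto the manifold by renormalising.
    """
    x = x0 / np.linalg.norm(x0)
    for _ in range(steps):
        g = f_grad(x)
        tangent = g - (g @ x) * x        # tangent-space projection
        x = x - lr * tangent
        x = x / np.linalg.norm(x)        # retraction back onto the sphere
    return x

# Example: minimising x^T A x over the sphere recovers the eigenvector
# of A with the smallest eigenvalue.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
A = A + A.T                              # symmetrise
x_star = riemannian_gd_sphere(lambda x: 2 * A @ x, rng.standard_normal(5))
lam_min = np.linalg.eigvalsh(A)[0]
```

The projection-then-retract recipe is the simplest instance of the machinery that Absil, Mahony, and Sepulchre (2008), Manopt, and Pymanopt generalise to arbitrary matrix manifolds.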

Girolami and Calderhead (2011) discuss Langevin Monte Carlo in this setting.
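A heavily simplified sketch of the idea, in plain numpy: Metropolis-adjusted Langevin with a *constant* metric G, i.e. preconditioned MALA. The full Riemann-manifold version of Girolami and Calderhead lets G vary with position and adds curvature-correction terms, all omitted here; function names and the test target are mine.

```python
import numpy as np

def precond_mala(logpi, grad_logpi, G, x0, eps=0.9, n=20000, seed=1):
    """MALA with a fixed metric G (a constant-metric simplification).

    Proposal: y = x + (eps^2 / 2) G^{-1} grad log pi(x) + eps G^{-1/2} xi,
    followed by a Metropolis-Hastings accept/reject step.
    """
    rng = np.random.default_rng(seed)
    Ginv = np.linalg.inv(G)
    L = np.linalg.cholesky(Ginv)             # L L^T = G^{-1}

    def prop_logdens(y, x):
        # log density (up to a constant) of proposing y from x
        m = x + 0.5 * eps**2 * Ginv @ grad_logpi(x)
        d = y - m
        return -0.5 * d @ G @ d / eps**2

    x, out = np.array(x0, float), []
    for _ in range(n):
        m = x + 0.5 * eps**2 * Ginv @ grad_logpi(x)
        y = m + eps * L @ rng.standard_normal(x.size)
        log_acc = (logpi(y) - logpi(x)
                   + prop_logdens(x, y) - prop_logdens(y, x))
        if np.log(rng.uniform()) < log_acc:
            x = y
        out.append(x.copy())                 # keep current state either way
    return np.array(out)

# Target: zero-mean Gaussian with covariance C; a natural choice of
# metric is the Fisher information G = C^{-1}.
C = np.array([[2.0, 0.8], [0.8, 1.0]])
P = np.linalg.inv(C)
samples = precond_mala(lambda x: -0.5 * x @ P @ x,
                       lambda x: -P @ x, P, np.zeros(2))
```

With the metric set to the target's inverse covariance, the proposal is isotropic in the whitened coordinates, which is the whole point of preconditioning.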

The headings below may one day be filled in.

Information Geometry

The unholy offspring of Fisher information and differential geometry, about which I know little except that it sounds like it should be intuitive. See also information criteria. Despite that intuitive sound, it is not mainstream, and it has not been especially useful to me even where it seemed it should be, at least not beyond the basic delta method.
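One concrete fact worth recording: the Fisher information is the second-order Taylor expansion of the KL divergence in the parameter, which is exactly what makes it a Riemannian metric on a parametric family. A quick numerical sanity check for the Bernoulli family (all plain numpy, my own toy):

```python
import numpy as np

def kl_bernoulli(p, q):
    """KL divergence between Bernoulli(p) and Bernoulli(q)."""
    return p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q))

def fisher_bernoulli(p):
    """Fisher information of Bernoulli(p): 1 / (p (1 - p))."""
    return 1.0 / (p * (1.0 - p))

p, eps = 0.3, 1e-4
# Locally, KL(p || p + eps) ~= (1/2) F(p) eps^2: KL behaves like a
# squared distance, with the Fisher information as the metric tensor.
kl = kl_bernoulli(p, p + eps)
quad = 0.5 * fisher_bernoulli(p) * eps**2
```

This is also why the delta method mentioned above gets you most of the practical mileage: locally, everything is the Fisher quadratic form.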

Hamiltonian Monte Carlo

You can also discuss Hamiltonian Monte Carlo in this setting. I will not.

Natural gradient

See natural gradients.
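As a one-line teaser for why the metric matters: preconditioning the gradient by the inverse Fisher information can collapse optimisation to almost nothing. A minimal sketch for a Bernoulli likelihood (my own example, not from Amari's paper), where one natural-gradient step with unit learning rate lands exactly on the MLE:

```python
import numpy as np

def natural_gradient_step(p, data, lr=1.0):
    """One natural-gradient ascent step for a Bernoulli likelihood.

    Average log-likelihood gradient: (p_hat - p) / (p (1 - p)).
    Fisher information: 1 / (p (1 - p)).
    The natural gradient F^{-1} grad is simply p_hat - p, so a unit
    learning rate jumps straight to the maximum-likelihood estimate.
    """
    p_hat = data.mean()
    grad = (p_hat - p) / (p * (1 - p))       # ordinary gradient
    fisher = 1.0 / (p * (1 - p))
    return p + lr * grad / fisher            # natural-gradient update

rng = np.random.default_rng(42)
data = (rng.uniform(size=1000) < 0.7).astype(float)
p1 = natural_gradient_step(0.2, data)        # one step from a bad start
```

The ordinary gradient, by contrast, blows up near the boundary of the parameter space precisely where the Fisher metric says distances are stretched.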

Homogeneous probability

Albert Tarantola’s framing, from his maybe-forthcoming manuscript. How does it relate to information geometry? I don’t know yet; I haven’t had time to read it. It is also not a very common phrasing, which is a danger sign.

To Read


Absil, P.-A, R Mahony, and R Sepulchre. 2008. Optimization Algorithms on Matrix Manifolds. Princeton, N.J.; Woodstock: Princeton University Press.

Amari, Shun-ichi. 1987. “Differential Geometrical Theory of Statistics.” In Differential Geometry in Statistical Inference, 19–94.

———. 1998. “Natural Gradient Works Efficiently in Learning.” Neural Computation 10 (2): 251–76.

———. 2001. “Information Geometry on Hierarchy of Probability Distributions.” IEEE Transactions on Information Theory 47: 1701–11.

Aswani, Anil, Peter Bickel, and Claire Tomlin. 2011. “Regression on Manifolds: Estimation of the Exterior Derivative.” The Annals of Statistics 39 (1): 48–81.

Barndorff-Nielsen, O. E. 1987. “Differential and Integral Geometry in Statistical Inference.” In Differential Geometry in Statistical Inference. Aarhus.

Betancourt, Michael, Simon Byrne, Sam Livingstone, and Mark Girolami. 2017. “The Geometric Foundations of Hamiltonian Monte Carlo.” Bernoulli 23 (4A): 2257–98.

Boumal, Nicolas. 2013. “On Intrinsic Cramér-Rao Bounds for Riemannian Submanifolds and Quotient Manifolds.” IEEE Transactions on Signal Processing 61 (7): 1809–21.

Boumal, Nicolas, Bamdev Mishra, P.-A. Absil, and Rodolphe Sepulchre. 2014. “Manopt, a Matlab Toolbox for Optimization on Manifolds.” Journal of Machine Learning Research 15: 1455–9.

Boumal, Nicolas, Amit Singer, P.-A. Absil, and Vincent D. Blondel. 2014. “Cramér-Rao Bounds for Synchronization of Rotations.” Information and Inference 3 (1): 1–39.

Carlsson, Gunnar, Tigran Ishkhanov, Vin de Silva, and Afra Zomorodian. 2008. “On the Local Behavior of Spaces of Natural Images.” International Journal of Computer Vision 76 (1): 1–12.

Chen, Minhua, J. Silva, J. Paisley, Chunping Wang, D. Dunson, and L. Carin. 2010. “Compressive Sensing on Manifolds Using a Nonparametric Mixture of Factor Analyzers: Algorithm and Performance Bounds.” IEEE Transactions on Signal Processing 58 (12): 6140–55.

Fernández-Martínez, J. L., Z. Fernández-Muñiz, J. L. G. Pallero, and L. M. Pedruelo-González. 2013. “From Bayes to Tarantola: New Insights to Understand Uncertainty in Inverse Problems.” Journal of Applied Geophysics 98 (November): 62–72.

Ge, Rong, and Tengyu Ma. 2017. “On the Optimization Landscape of Tensor Decompositions.” In Advances in Neural Information Processing Systems.

Girolami, Mark, and Ben Calderhead. 2011. “Riemann Manifold Langevin and Hamiltonian Monte Carlo Methods.” Journal of the Royal Statistical Society: Series B (Statistical Methodology) 73 (2): 123–214.

Hosseini, Reshad, and Suvrit Sra. 2015. “Manifold Optimization for Gaussian Mixture Models.” arXiv preprint arXiv:1506.07677.

Lauritzen, S L. 1987. “Statistical Manifolds.” In Differential Geometry in Statistical Inference, 164. JSTOR.

Miolane, Nina, Johan Mathe, Claire Donnat, Mikael Jorda, and Xavier Pennec. 2018. “Geomstats: A Python Package for Riemannian Geometry in Machine Learning,” May.

Mosegaard, Klaus, and Albert Tarantola. 1995. “Monte Carlo Sampling of Solutions to Inverse Problems.” Journal of Geophysical Research 100 (B7): 12431.

Mukherjee, Sayan, Qiang Wu, and Ding-Xuan Zhou. 2010. “Learning Gradients on Manifolds.” Bernoulli 16 (1): 181–207.

Peters, Jan. 2010. “Policy Gradient Methods.” Scholarpedia 5 (11): 3698.

Steinke, Florian, and Matthias Hein. 2009. “Non-Parametric Regression Between Manifolds.” In Advances in Neural Information Processing Systems 21, 1561–8. Curran Associates, Inc.

Townsend, James, Niklas Koep, and Sebastian Weichwald. 2016. “Pymanopt: A Python Toolbox for Optimization on Manifolds Using Automatic Differentiation.” Journal of Machine Learning Research 17 (137): 1–5.

Wang, Yu Guang, and Xiaosheng Zhuang. 2016. “Tight Framelets and Fast Framelet Transforms on Manifolds,” August.

Xifara, T., C. Sherlock, S. Livingstone, S. Byrne, and M. Girolami. 2014. “Langevin Diffusions and the Metropolis-Adjusted Langevin Algorithm.” Statistics & Probability Letters 91 (Supplement C): 14–19.