“Gaussian processes” are stochastic processes whose finite-dimensional marginal distributions are jointly Gaussian — Brownian motion and its relatives, for example. Very prominent in, e.g., spatial statistics.

However, when you see it capitalised it tends to signal a specific emphasis: the use of these processes for regression, as a nonparametric method with a conveniently Bayesian interpretation. The basic trick is to use covariance estimation and/or Gaussian process simulation on some cleverly chosen Hilbert space to do functional regression.
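A minimal sketch of that trick in plain numpy, assuming a squared-exponential kernel, a zero prior mean, and made-up data:

```python
import numpy as np

def rbf(xa, xb, ell=1.0, sigma_f=1.0):
    """Squared-exponential covariance between two sets of 1-d inputs."""
    sq = (xa[:, None] - xb[None, :]) ** 2
    return sigma_f ** 2 * np.exp(-0.5 * sq / ell ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-2, ell=1.0):
    """Posterior mean and covariance of a zero-mean GP with RBF kernel."""
    K = rbf(x_train, x_train, ell) + noise * np.eye(len(x_train))
    Ks = rbf(x_train, x_test, ell)
    Kss = rbf(x_test, x_test, ell)
    # Cholesky solve for numerical stability. Note the O(n^3) cost in the
    # number of training points — the scaling complained about below.
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    cov = Kss - v.T @ v
    return mean, cov

x = np.linspace(0, 5, 20)
y = np.sin(x)
xs = np.linspace(0, 5, 50)
mu, cov = gp_posterior(x, y, xs)
```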

I feel this is not too complex, but I’ve not looked deeply into it. They reputedly combine well with kernel methods to do machine learning. The details of this are still hazy to me, and they aren’t currently on the correct side of the hype curve for me to dive in.

As the gaussianprocess.org site puts it:

> This web site aims to provide an overview of resources concerned with probabilistic modeling, inference and learning based on Gaussian processes. Although Gaussian processes have a long history in the field of statistics, they seem to have been employed extensively only in niche areas. With the advent of kernel machines in the machine learning community, models based on Gaussian processes have become commonplace for problems of regression (kriging) and classification as well as a host of more specialized applications.

I’ve not been very enthusiastic about these in the past, on the grounds that they weren’t worth the trouble. It’s nice to have a principled nonparametric Bayesian formalism, but it’s pointless having a formalism so computationally demanding that people don’t try to use more than a thousand data points.

However, perhaps I should be persuaded by AutoGP (BoKD16), which breaks a lot of the awful computational deadlocks by clever use of inducing variables and variational approximation, producing a compressed representation of the data with tractable inference and model selection (including kernel selection), and doing the whole thing in many dimensions simultaneously.

## Implementations

The current scikit-learn has semi-fancy Gaussian processes, and an introduction:

> Gaussian Processes (GP) are a generic supervised learning method designed to solve regression and probabilistic classification problems.
>
> The advantages of Gaussian processes are:
>
> - The prediction interpolates the observations (at least for regular kernels).
> - The prediction is probabilistic (Gaussian) so that one can compute empirical confidence intervals and decide based on those if one should refit (online fitting, adaptive fitting) the prediction in some region of interest.
> - Versatile: different kernels can be specified. Common kernels are provided, but it is also possible to specify custom kernels.
>
> The disadvantages of Gaussian processes include:
>
> - They are not sparse, i.e., they use the whole samples/features information to perform the prediction.
> - They lose efficiency in high dimensional spaces – namely when the number of features exceeds a few dozens.

Is that last point strictly true? Surely an appropriate kernel could ameliorate the dimensionality problem?
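For orientation, a minimal usage sketch of the scikit-learn implementation (`GaussianProcessRegressor`); the data and kernel choice here are made up for illustration:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.uniform(0, 5, size=(30, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(30)

# Kernel hyperparameters (length scale, noise level) are fitted by
# maximising the marginal likelihood inside .fit().
kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
gpr = GaussianProcessRegressor(kernel=kernel, n_restarts_optimizer=3)
gpr.fit(X, y)

Xs = np.linspace(0, 5, 100)[:, None]
mean, std = gpr.predict(Xs, return_std=True)  # probabilistic prediction
```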

There are even fancier Gaussian processes out there. Chris Fonnesbeck mentions GPflow, AutoGP, PyMC3, and the scikit-learn implementation. Plus I notice skgmm is a fancified version of the scikit-learn one. So… it’s easy enough to be bikeshedded is the message I’m getting here.

Questions:

- Can I infer a density using these? Or is it strictly in a regression/classification setting that the machinery works? (EDIT: Yes, you can.)
- Can you somehow make them (in some sense) sparse after all, using kernel approximation techniques? Is this what the variational version does?
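On the second question: kernel approximation can indeed buy a kind of sparsity. A sketch of the Nyström approximation (WiSe01), which replaces the full Gram matrix with a low-rank surrogate built from a handful of landmark points (landmark placement and sizes here are arbitrary):

```python
import numpy as np

def rbf(xa, xb, ell=1.0):
    return np.exp(-0.5 * (xa[:, None] - xb[None, :]) ** 2 / ell ** 2)

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 10, 1000))
m = 50  # number of landmark inputs, m << n
z = np.linspace(0, 10, m)

# Nystrom: K_nn ~= K_nm K_mm^{-1} K_mn, a rank-m surrogate for the
# full 1000 x 1000 Gram matrix; downstream solves then cost O(n m^2).
Knm = rbf(x, z)
Kmm = rbf(z, z) + 1e-8 * np.eye(m)   # jitter for numerical stability
K_approx = Knm @ np.linalg.solve(Kmm, Knm.T)

K_full = rbf(x, x)
err = np.max(np.abs(K_full - K_approx))
```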

## Kernels

a.k.a. covariance models.

GP models are the meeting of Covariance estimation and kernel machines.

### Matérn

The Matérn stationary (and in the Euclidean case, isotropic) covariance function is one model for covariance. See Carl Edward Rasmussen’s Gaussian Process lecture notes for a readable explanation, or chapter 4 of his textbook (RaWi06).
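For reference, the general form (as in RaWi06, ch. 4), in terms of the distance $r = |x - x'|$, length scale $\ell$, and smoothness parameter $\nu$, is

$$
k_\nu(r) = \sigma^2 \frac{2^{1-\nu}}{\Gamma(\nu)} \left(\frac{\sqrt{2\nu}\,r}{\ell}\right)^{\nu} K_\nu\!\left(\frac{\sqrt{2\nu}\,r}{\ell}\right),
$$

where $K_\nu$ is the modified Bessel function of the second kind. The half-integer cases are the ones used in practice: $\nu = 1/2$ gives the exponential covariance $\sigma^2 e^{-r/\ell}$, $\nu = 3/2$ gives $\sigma^2\left(1 + \sqrt{3}r/\ell\right)e^{-\sqrt{3}r/\ell}$, and $\nu \to \infty$ recovers the squared exponential.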

### Cyclic

TBD
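In the meantime, the standard periodic covariance (usually attributed to MacKay, Mack98) is easy to write down: warp the input through $(\sin, \cos)$ and apply an RBF in the warped space. A sketch:

```python
import numpy as np

def periodic_kernel(xa, xb, period=1.0, ell=1.0, sigma_f=1.0):
    """Periodic covariance: exact correlation between points an integer
    number of periods apart."""
    d = np.abs(xa[:, None] - xb[None, :])
    return sigma_f ** 2 * np.exp(
        -2.0 * np.sin(np.pi * d / period) ** 2 / ell ** 2
    )

x = np.linspace(0, 3, 7)   # spacing 0.5, so x[0] and x[2] are one period apart
K = periodic_kernel(x, x, period=1.0)
```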

## Approximation with state filtering

Looks interesting. Without knowing enough about either to make an informed judgement, I imagine this makes Gaussian process regression tractable by making it local, i.e. Markov, with respect to some assumed hidden state, in the same way that Kalman filtering localises Wiener filtering. This would address at least some of the criticisms about sparsity etc.
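That intuition can be made concrete in the simplest case: the exponential (Matérn-1/2, i.e. Ornstein–Uhlenbeck) covariance is Markov in one dimension, so the GP posterior mean at the data points comes out of a Kalman filter plus RTS smoother in O(n) rather than O(n³). A sketch, checked against the direct solve:

```python
import numpy as np

# OU covariance k(t, t') = sf2 * exp(-|t - t'| / ell) is Markov.
sf2, ell, noise = 1.0, 1.0, 0.1

rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0, 10, 200))
y = np.sin(t) + np.sqrt(noise) * rng.standard_normal(200)

# --- O(n) state-space solution: Kalman filter forward... ---
m, P = 0.0, sf2                 # prior state mean/variance
ms, Ps, preds = [], [], []
t_prev = t[0]
for ti, yi in zip(t, y):
    a = np.exp(-(ti - t_prev) / ell)        # transition over the time gap
    m, P = a * m, a * a * P + sf2 * (1 - a * a)
    preds.append((m, P))                    # predicted (prior) moments
    k = P / (P + noise)                     # Kalman gain
    m, P = m + k * (yi - m), (1 - k) * P
    ms.append(m)
    Ps.append(P)
    t_prev = ti

# --- ...then Rauch-Tung-Striebel smoother backward. ---
ms, Ps = np.array(ms), np.array(Ps)
m_sm = ms.copy()
for i in range(len(t) - 2, -1, -1):
    a = np.exp(-(t[i + 1] - t[i]) / ell)
    mp, Pp = preds[i + 1]
    G = Ps[i] * a / Pp
    m_sm[i] = ms[i] + G * (m_sm[i + 1] - mp)

# --- O(n^3) direct GP solve for comparison: should agree. ---
K = sf2 * np.exp(-np.abs(t[:, None] - t[None, :]) / ell)
mu = K @ np.linalg.solve(K + noise * np.eye(len(t)), y)
```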

See Simo Särkkä’s work for that (HaSä10, SäHa12, SäSH13, KaSä16).

## Approximation with variational inference

TBD.

## Approximation with inducing variables

TBD.

## Approximation with variational inference and inducing variables
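This combination is the one behind Tits09a and, further down the line, AutoGP. Titsias’s collapsed evidence lower bound on the marginal likelihood, with $m$ inducing variables and noise variance $\sigma^2$, is

$$
\log p(\mathbf{y}) \;\ge\; \log \mathcal{N}\!\left(\mathbf{y} \mid \mathbf{0},\, \mathbf{Q}_{nn} + \sigma^2 \mathbf{I}\right) \;-\; \frac{1}{2\sigma^2}\operatorname{tr}\!\left(\mathbf{K}_{nn} - \mathbf{Q}_{nn}\right),
\qquad
\mathbf{Q}_{nn} = \mathbf{K}_{nm}\mathbf{K}_{mm}^{-1}\mathbf{K}_{mn}.
$$

The trace term penalises information lost by the low-rank approximation, which is what lets one optimise the inducing inputs as variational parameters rather than model parameters.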

## Readings

This lecture by the late David MacKay is probably good; the man could talk.

## Refs

- Abra97: Abrahamsen, P. (1997) A review of Gaussian random fields and correlation functions.
- AlSH04: Altun, Y., Smola, A. J., & Hofmann, T. (2004) Exponential Families for Conditional Random Fields. In Proceedings of the 20th Conference on Uncertainty in Artificial Intelligence (pp. 2–9). Arlington, Virginia, United States: AUAI Press.
- BiMa06: Birgé, L., & Massart, P. (2006) Minimal Penalties for Gaussian Model Selection. *Probability Theory and Related Fields*, 138(1–2), 33–73. DOI.
- BoCW07: Bonilla, E. V., Chai, K. M. A., & Williams, C. K. I. (2007) Multi-task Gaussian Process Prediction. In Proceedings of the 20th International Conference on Neural Information Processing Systems (pp. 153–160). USA: Curran Associates Inc.
- BoKD16: Bonilla, E. V., Krauth, K., & Dezfouli, A. (2016) Generic Inference in Latent Gaussian Process Models. *ArXiv:1609.00577 [Stat]*.
- CBMF16: Cutajar, K., Bonilla, E. V., Michiardi, P., & Filippone, M. (2016) Practical Learning of Deep Gaussian Processes via Random Fourier Features. *ArXiv:1610.04386 [Stat]*.
- CBMF17: Cutajar, K., Bonilla, E. V., Michiardi, P., & Filippone, M. (2017) Random Feature Expansions for Deep Gaussian Processes. In PMLR.
- DLGT13: Duvenaud, D., Lloyd, J., Grosse, R., Tenenbaum, J., & Zoubin, G. (2013) Structure Discovery in Nonparametric Regression through Compositional Kernel Search. In Proceedings of the 30th International Conference on Machine Learning (ICML-13) (pp. 1166–1174).
- Ebde15: Ebden, M. (2015) Gaussian Processes: A Quick Introduction. *ArXiv:1505.02965 [Math, Stat]*.
- Emer07: Emery, X. (2007) Conditioning Simulations of Gaussian Random Fields by Ordinary Kriging. *Mathematical Geology*, 39(6), 607–623. DOI.
- GaWi14: Gal, Y., & van der Wilk, M. (2014) Variational Inference in Sparse Gaussian Process Regression and Latent Variable Models – a Gentle Tutorial. *ArXiv:1402.1412 [Stat]*.
- GSFT12: Grosse, R., Salakhutdinov, R. R., Freeman, W. T., & Tenenbaum, J. B. (2012) Exploiting compositionality to explore a large space of model structures. In Proceedings of the Conference on Uncertainty in Artificial Intelligence.
- HaSä10: Hartikainen, J., & Särkkä, S. (2010) Kalman filtering and smoothing solutions to temporal Gaussian process regression models. In 2010 IEEE International Workshop on Machine Learning for Signal Processing (pp. 379–384). DOI.
- Jord99: Jordan, M. I. (1999) Learning in graphical models. Cambridge, Mass.: MIT Press.
- KaSä16: Karvonen, T., & Särkkä, S. (2016) Approximate state-space Gaussian processes via spectral transformation.
- KiWe13: Kingma, D. P., & Welling, M. (2013) Auto-Encoding Variational Bayes. *ArXiv:1312.6114 [Cs, Stat]*.
- KBCF16: Krauth, K., Bonilla, E. V., Cutajar, K., & Filippone, M. (2016) AutoGP: Exploring the Capabilities and Limitations of Gaussian Process Models. In arXiv:1610.05392 [stat].
- KrBo13: Kroese, D. P., & Botev, Z. I. (2013) Spatial process generation. *ArXiv:1308.0399 [Stat]*.
- LaSH03: Lawrence, N., Seeger, M., & Herbrich, R. (2003) Fast sparse Gaussian process methods: The informative vector machine. In Proceedings of the 16th Annual Conference on Neural Information Processing Systems (pp. 609–616).
- LDGT14: Lloyd, J. R., Duvenaud, D., Grosse, R., Tenenbaum, J. B., & Ghahramani, Z. (2014) Automatic Construction and Natural-Language Description of Nonparametric Regression Models. *ArXiv:1402.4304 [Cs, Stat]*.
- Mack98: MacKay, D. J. C. (1998) Introduction to Gaussian processes. *NATO ASI Series. Series F: Computer and System Sciences*, 133–165.
- Mack02: MacKay, D. J. C. (2002) Gaussian Processes. In Information Theory, Inference & Learning Algorithms (Chapter 45). Cambridge University Press.
- MWNF16: Matthews, A. G. de G., van der Wilk, M., Nickson, T., Fujii, K., Boukouvalas, A., León-Villagrá, P., … Hensman, J. (2016) GPflow: A Gaussian process library using TensorFlow. *ArXiv:1610.08733 [Stat]*.
- QuRa05: Quiñonero-Candela, J., & Rasmussen, C. E. (2005) A Unifying View of Sparse Approximate Gaussian Process Regression. *Journal of Machine Learning Research*, 6(Dec), 1939–1959.
- RaKa17: Raissi, M., & Karniadakis, G. E. (2017) Machine Learning of Linear Differential Equations using Gaussian Processes. *ArXiv:1701.02440 [Cs, Math, Stat]*.
- RaWi06: Rasmussen, C. E., & Williams, C. K. I. (2006) Gaussian processes for machine learning. Cambridge, Mass: MIT Press.
- Särk13: Särkkä, S. (2013) Bayesian filtering and smoothing. Cambridge, U.K.; New York: Cambridge University Press.
- SäHa12: Särkkä, S., & Hartikainen, J. (2012) Infinite-Dimensional Kalman Filtering Approach to Spatio-Temporal Gaussian Process Regression. In Journal of Machine Learning Research.
- SäSH13: Särkkä, S., Solin, A., & Hartikainen, J. (2013) Spatiotemporal Learning via Infinite-Dimensional Bayesian Filtering and Smoothing: A Look at Gaussian Process Regression Through Kalman Filtering. *IEEE Signal Processing Magazine*, 30(4), 51–61. DOI.
- SnGh05: Snelson, E., & Ghahramani, Z. (2005) Sparse Gaussian processes using pseudo-inputs. In Advances in Neural Information Processing Systems (pp. 1257–1264).
- Tits09a: Titsias, M. K. (2009a) Variational learning of inducing variables in sparse Gaussian processes. In International Conference on Artificial Intelligence and Statistics (pp. 567–574).
- Tits09b: Titsias, M. K. (2009b) Variational model selection for sparse Gaussian process regression: Technical supplement. Technical report, School of Computer Science, University of Manchester.
- WaKS08: Walder, C., Kim, K. I., & Schölkopf, B. (2008) Sparse Multiscale Gaussian Process Regression. In Proceedings of the 25th International Conference on Machine Learning (pp. 1112–1119). New York, NY, USA: ACM. DOI.
- WaSC06: Walder, C., Schölkopf, B., & Chapelle, O. (2006) Implicit Surface Modelling with a Globally Regularised Basis of Compact Support. *Computer Graphics Forum*, 25(3), 635–644. DOI.
- WiSe01: Williams, C. K., & Seeger, M. (2001) Using the Nyström Method to Speed Up Kernel Machines. In Advances in Neural Information Processing Systems (pp. 682–688).
- WKVC09: Williams, C., Klanke, S., Vijayakumar, S., & Chai, K. M. (2009) Multi-task Gaussian Process Learning of Robot Inverse Dynamics. In D. Koller, D. Schuurmans, Y. Bengio, & L. Bottou (Eds.), Advances in Neural Information Processing Systems 21 (pp. 265–272). Curran Associates, Inc.
- WiAd13: Wilson, A. G., & Adams, R. P. (2013) Gaussian Process Kernels for Pattern Discovery and Extrapolation. *ArXiv:1302.4245 [Cs, Stat]*.
- WDLX15: Wilson, A. G., Dann, C., Lucas, C. G., & Xing, E. P. (2015) The Human Kernel. *ArXiv:1510.07389 [Cs, Stat]*.