Gaussian process regression

And possibly classification or other learning approaches

Usefulness: 🔧 🔧
Novelty: 💡
Uncertainty: 🤪 🤪 🤪
Incompleteness: 🚧 🚧 🚧
GP regression

Chi Feng’s amazing GP demo.

“Gaussian Processes” are stochastic processes/fields with jointly Gaussian distributions of observations. Capitalised, the term tends to carry a specific emphasis: the use of these processes for regression, as a nonparametric method with a conveniently Bayesian interpretation. The basic trick is a clever union of Hilbert spaces and probability that gives functional regression a probabilistic interpretation as a kind of nonparametric Bayesian posterior inference, yielding distributions over posterior functions. Regression using Gaussian processes is common in, e.g., spatial statistics, where it arises as kriging.
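To make “distributions over posterior functions” concrete, here is a minimal numpy sketch of exact GP regression with a squared-exponential kernel; the toy data and hyperparameters are placeholders, not recommendations. Note the Cholesky factorisation of the n×n Gram matrix, which is the cubic-cost step complained about below.

```python
import numpy as np

def rbf(xa, xb, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel k(x, x') = v exp(-(x - x')^2 / (2 l^2))."""
    return variance * np.exp(
        -0.5 * (xa[:, None] - xb[None, :]) ** 2 / lengthscale**2
    )

rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, size=20)                # toy 1-d inputs
y = np.sin(x) + 0.1 * rng.standard_normal(20)  # noisy observations
noise = 0.1**2

xs = np.linspace(-3, 3, 100)                   # test inputs
Knn = rbf(x, x) + noise * np.eye(len(x))
L = np.linalg.cholesky(Knn)                    # the O(n^3) bottleneck
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
mean = rbf(xs, x) @ alpha                      # posterior mean
V = np.linalg.solve(L, rbf(x, xs))
var = rbf(xs, xs).diagonal() - (V**2).sum(0)   # posterior variance
```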

Gaussianprocess.org:

This web site aims to provide an overview of resources concerned with probabilistic modeling, inference and learning based on Gaussian processes. Although Gaussian processes have a long history in the field of statistics, they seem to have been employed extensively only in niche areas. With the advent of kernel machines in the machine learning community, models based on Gaussian processes have become commonplace for problems of regression (kriging) and classification as well as a host of more specialized applications.

I’ve not been very enthusiastic about these in the past. It’s nice to have a principled nonparametric Bayesian formalism, but it’s pointless having a formalism that is so computationally demanding that people don’t try to use more than a thousand datapoints.

However, perhaps I should be persuaded by tricks such as AutoGP (Krauth et al. 2016), which breaks some computational deadlocks through clever use of inducing variables and variational approximation, producing a compressed representation of the data with tractable inference and model selection (including kernel selection), and doing the whole thing in many dimensions simultaneously. There are other clever tricks along these lines; several are sketched below.

Density estimation

Can I infer a density using these? Yes. One popular method is the logistic Gaussian process (Lenk 2003; Tokdar 2007).
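Sketched as a formula, the logistic Gaussian process pushes a GP draw through exponentiation and normalisation to get a nonnegative function integrating to one,

$$
p(x) = \frac{\exp g(x)}{\int_{\mathcal{X}} \exp g(s)\,\mathrm{d}s},
\qquad g \sim \mathcal{GP}(\mu, k),
$$

the catch being that intractable normalising integral, which is what the cited papers work to tame.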

Kernels

a.k.a. covariance models.

GP models are the meeting of Covariance estimation and kernel machines.

See covariance models.

Approximation with state filtering

a.k.a. Kalman filtering Gaussian Processes.
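The idea (e.g. Hartikainen and Särkkä 2010; Särkkä, Solin, and Hartikainen 2013): for one-dimensional, time-like inputs and Markovian kernels, a GP is equivalent to a linear-Gaussian state-space model, so the regression posterior comes out of a Kalman filter/smoother in O(n) time instead of O(n³). Here is a minimal sketch for the exponential (Matérn-1/2) kernel, whose state-space form is a scalar Ornstein–Uhlenbeck process; the constants are illustrative and the inputs are assumed sorted.

```python
import numpy as np

def kalman_gp_filter(t, y, ell=1.0, s2=1.0, noise=0.01):
    """Filtered mean/variance of the latent function at sorted inputs t,
    for the kernel k(t, t') = s2 * exp(-|t - t'| / ell)."""
    m, P = 0.0, s2                            # stationary prior at first input
    means, variances = [], []
    prev = None
    for tk, yk in zip(t, y):
        if prev is not None:                  # predict: exact OU transition
            A = np.exp(-(tk - prev) / ell)
            m = A * m
            P = A * A * P + s2 * (1.0 - A * A)
        K = P / (P + noise)                   # update: scalar Kalman gain
        m = m + K * (yk - m)
        P = (1.0 - K) * P
        means.append(m)
        variances.append(P)
        prev = tk
    return np.array(means), np.array(variances)
```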

Approximation with variational inference

🚧

Approximation with inducing variables

“Sparse GP”. 🚧
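A crude sketch of the basic device, in the subset-of-regressors/DTC flavour surveyed by Quiñonero-Candela and Rasmussen (2005), not any particular library's method: route all information through m ≪ n inducing inputs z so that only an m × m system need be solved, dropping the cost from O(n³) to O(nm²). Kernel, hyperparameters, and the placement of z are all illustrative assumptions here.

```python
import numpy as np

def rbf(xa, xb, ls=1.0):
    return np.exp(-0.5 * (xa[:, None] - xb[None, :]) ** 2 / ls**2)

def sparse_gp_mean(x, y, xs, z, noise=0.01):
    """DTC/SoR predictive mean at xs through inducing inputs z."""
    Kmm = rbf(z, z) + 1e-8 * np.eye(len(z))   # jitter for numerical stability
    Kmn = rbf(z, x)
    Ksm = rbf(xs, z)
    A = noise * Kmm + Kmn @ Kmn.T             # only m x m, not n x n
    return Ksm @ np.linalg.solve(A, Kmn @ y)
```

Where to put z, and how to tune it without overfitting, is exactly what the variational treatment below addresses.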

Approximation with variational inference and inducing variables

This combination is what makes AutoGP work (Krauth et al. 2016). 🚧
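To give the flavour: with inducing inputs z and the Nyström approximation $Q_{nn} := K_{nm}K_{mm}^{-1}K_{mn}$, Titsias’s (2009a) collapsed variational bound on the log marginal likelihood is

$$
\log p(y) \;\ge\; \log \mathcal{N}\!\left(y \mid 0,\, Q_{nn} + \sigma^2 I\right)
\;-\; \frac{1}{2\sigma^2}\operatorname{tr}\!\left(K_{nn} - Q_{nn}\right),
$$

so the inducing inputs become variational parameters, optimised to tighten the bound rather than to fit the data. AutoGP builds on bounds in this style, extended to black-box likelihoods and many dimensions at once.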

Dimension reduction

e.g. GP-LVM (Lawrence 2005). 🚧

Readings

This lecture by the late David MacKay is probably good; the man could talk.

There is also a well-illustrated and elementary introduction by Yuge Shi.

Implementations

Bayes workhorse Stan can do Gaussian process regression just like everything else; see Michael Betancourt’s blog posts: 1, 2, 3.

The current scikit-learn has semi-fancy Gaussian processes, and an introduction:

Gaussian Processes (GP) are a generic supervised learning method designed to solve regression and probabilistic classification problems.

The advantages of Gaussian processes are:

- The prediction interpolates the observations (at least for regular kernels).
- The prediction is probabilistic (Gaussian), so that one can compute empirical confidence intervals.
- Versatile: different kernels can be specified.

The disadvantages of Gaussian processes include:

- They are not sparse, i.e., they use the whole samples/features information to perform the prediction.
- They lose efficiency in high dimensional spaces, namely when the number of features exceeds a few dozens.

Those last couple of points are not strictly correct: GPs can be made, in various senses, sparse, and the scaling cost in the number of features is swamped by the scaling cost in the number of data points. This kind of half-arsery is worrisome.
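In any case, the interface is pleasant. A minimal usage sketch of the scikit-learn version (the API is real; the toy data and kernel choices are my own assumptions):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(50)

kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
gpr = GaussianProcessRegressor(kernel=kernel).fit(X, y)  # ML-II hyperparameters
mean, std = gpr.predict(X, return_std=True)              # probabilistic output
```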

There are fancier Gaussian process toolsets. Chris Fonnesbeck mentions GPflow, autogp, PyMC3, and the scikit-learn implementation. Plus I notice skgmm is a fancified version of the scikit-learn one. George is another Python GP regression package, one that claims to handle big data at the cost of lots of C++. GPStuff (https://github.com/gpstuff-dev/gpstuff) is the one for MATLAB/Octave that I have seen around the place. So… it’s easy enough to be bikeshedded, is the message I’m getting here.

Refs

Abrahamsen, Petter. 1997. “A Review of Gaussian Random Fields and Correlation Functions.” http://publications.nr.no/publications.nr.no/directdownload/publications.nr.no/rask/old/917_Rapport.pdf.

Abt, Markus, and William J. Welch. 1998. “Fisher Information and Maximum-Likelihood Estimation of Covariance Parameters in Gaussian Stochastic Processes.” Canadian Journal of Statistics 26 (1): 127–37. https://doi.org/10.2307/3315678.

Altun, Yasemin, Alex J. Smola, and Thomas Hofmann. 2004. “Exponential Families for Conditional Random Fields.” In Proceedings of the 20th Conference on Uncertainty in Artificial Intelligence, 2–9. UAI ’04. Arlington, Virginia, United States: AUAI Press. http://arxiv.org/abs/1207.4131.

Birgé, Lucien, and Pascal Massart. 2006. “Minimal Penalties for Gaussian Model Selection.” Probability Theory and Related Fields 138 (1-2): 33–73. https://doi.org/10.1007/s00440-006-0011-8.

Bonilla, Edwin V., Kian Ming A. Chai, and Christopher K. I. Williams. 2007. “Multi-Task Gaussian Process Prediction.” In Proceedings of the 20th International Conference on Neural Information Processing Systems, 153–60. NIPS’07. USA: Curran Associates Inc. http://dl.acm.org/citation.cfm?id=2981562.2981582.

Bonilla, Edwin V., Karl Krauth, and Amir Dezfouli. 2016. “Generic Inference in Latent Gaussian Process Models,” September. http://arxiv.org/abs/1609.00577.

Chen, Tian Qi, Yulia Rubanova, Jesse Bettencourt, and David K Duvenaud. 2018. “Neural Ordinary Differential Equations.” In Advances in Neural Information Processing Systems 31, edited by S. Bengio, H. Wallach, H. Larochelle, K. Grauman, N. Cesa-Bianchi, and R. Garnett, 6572–83. Curran Associates, Inc. http://papers.nips.cc/paper/7892-neural-ordinary-differential-equations.pdf.

Csató, Lehel, Manfred Opper, and Ole Winther. 2001. “TAP Gibbs Free Energy, Belief Propagation and Sparsity.” In Proceedings of the 14th International Conference on Neural Information Processing Systems: Natural and Synthetic, 657–63. NIPS’01. Cambridge, MA, USA: MIT Press. http://papers.nips.cc/paper/2027-tap-gibbs-free-energy-belief-propagation-and-sparsity.pdf.

Cunningham, John P., Krishna V. Shenoy, and Maneesh Sahani. 2008. “Fast Gaussian Process Methods for Point Process Intensity Estimation.” In Proceedings of the 25th International Conference on Machine Learning, 192–99. ICML ’08. New York, NY, USA: ACM Press. https://doi.org/10.1145/1390156.1390181.

Cutajar, Kurt, Edwin V. Bonilla, Pietro Michiardi, and Maurizio Filippone. 2017. “Random Feature Expansions for Deep Gaussian Processes.” In PMLR. http://proceedings.mlr.press/v70/cutajar17a.html.

Dahl, Astrid, and Edwin V. Bonilla. 2019. “Sparse Grouped Gaussian Processes for Solar Power Forecasting,” March. http://arxiv.org/abs/1903.03986.

Damianou, Andreas, Michalis K. Titsias, and Neil D. Lawrence. 2011. “Variational Gaussian Process Dynamical Systems.” In Advances in Neural Information Processing Systems 24, edited by J. Shawe-Taylor, R. S. Zemel, P. L. Bartlett, F. Pereira, and K. Q. Weinberger, 2510–8. Curran Associates, Inc. http://papers.nips.cc/paper/4330-variational-gaussian-process-dynamical-systems.pdf.

Dezfouli, Amir, and Edwin V. Bonilla. 2015. “Scalable Inference for Gaussian Process Models with Black-Box Likelihoods.” In Advances in Neural Information Processing Systems 28, 1414–22. NIPS’15. Cambridge, MA, USA: MIT Press. http://dl.acm.org/citation.cfm?id=2969239.2969397.

Duvenaud, David. 2014. “Automatic Model Construction with Gaussian Processes.” PhD Thesis, University of Cambridge. https://github.com/duvenaud/phd-thesis.

Duvenaud, David, James Lloyd, Roger Grosse, Joshua Tenenbaum, and Ghahramani Zoubin. 2013. “Structure Discovery in Nonparametric Regression Through Compositional Kernel Search.” In Proceedings of the 30th International Conference on Machine Learning (ICML-13), 1166–74. http://machinelearning.wustl.edu/mlpapers/papers/icml2013_duvenaud13.

Ebden, Mark. 2015. “Gaussian Processes: A Quick Introduction,” May. http://arxiv.org/abs/1505.02965.

Eleftheriadis, Stefanos, Tom Nicholson, Marc Deisenroth, and James Hensman. 2017. “Identification of Gaussian Process State Space Models.” In Advances in Neural Information Processing Systems 30, edited by I. Guyon, U. V. Luxburg, S. Bengio, H. Wallach, R. Fergus, S. Vishwanathan, and R. Garnett, 5309–19. Curran Associates, Inc. http://papers.nips.cc/paper/7115-identification-of-gaussian-process-state-space-models.pdf.

Emery, Xavier. 2007. “Conditioning Simulations of Gaussian Random Fields by Ordinary Kriging.” Mathematical Geology 39 (6): 607–23. https://doi.org/10.1007/s11004-007-9112-x.

Evgeniou, Theodoros, Charles A. Micchelli, and Massimiliano Pontil. 2005. “Learning Multiple Tasks with Kernel Methods.” Journal of Machine Learning Research 6 (Apr): 615–37. http://www.jmlr.org/papers/v6/evgeniou05a.html.

Ferguson, Thomas S. 1973. “A Bayesian Analysis of Some Nonparametric Problems.” The Annals of Statistics 1 (2): 209–30. https://doi.org/10.1214/aos/1176342360.

Föll, Roman, Bernard Haasdonk, Markus Hanselmann, and Holger Ulmer. 2017. “Deep Recurrent Gaussian Process with Variational Sparse Spectrum Approximation,” November. http://arxiv.org/abs/1711.00799.

Frigola, Roger, Yutian Chen, and Carl Edward Rasmussen. 2014. “Variational Gaussian Process State-Space Models.” In Advances in Neural Information Processing Systems 27, edited by Z. Ghahramani, M. Welling, C. Cortes, N. D. Lawrence, and K. Q. Weinberger, 3680–8. Curran Associates, Inc. http://papers.nips.cc/paper/5375-variational-gaussian-process-state-space-models.pdf.

Frigola, Roger, Fredrik Lindsten, Thomas B Schön, and Carl Edward Rasmussen. 2013. “Bayesian Inference and Learning in Gaussian Process State-Space Models with Particle MCMC.” In Advances in Neural Information Processing Systems 26, edited by C. J. C. Burges, L. Bottou, M. Welling, Z. Ghahramani, and K. Q. Weinberger, 3156–64. Curran Associates, Inc. http://papers.nips.cc/paper/5085-bayesian-inference-and-learning-in-gaussian-process-state-space-models-with-particle-mcmc.pdf.

Gal, Yarin, and Mark van der Wilk. 2014. “Variational Inference in Sparse Gaussian Process Regression and Latent Variable Models - a Gentle Tutorial,” February. http://arxiv.org/abs/1402.1412.

Garnelo, Marta, Dan Rosenbaum, Chris J. Maddison, Tiago Ramalho, David Saxton, Murray Shanahan, Yee Whye Teh, Danilo J. Rezende, and S. M. Ali Eslami. 2018. “Conditional Neural Processes,” July, 10. https://arxiv.org/abs/1807.01613v1.

Garnelo, Marta, Jonathan Schwarz, Dan Rosenbaum, Fabio Viola, Danilo J. Rezende, S. M. Ali Eslami, and Yee Whye Teh. 2018. “Neural Processes,” July. https://arxiv.org/abs/1807.01622v1.

Ghahramani, Zoubin. 2013. “Bayesian Non-Parametrics and the Probabilistic Approach to Modelling.” Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 371 (1984): 20110553. https://doi.org/10.1098/rsta.2011.0553.

Grosse, Roger, Ruslan R. Salakhutdinov, William T. Freeman, and Joshua B. Tenenbaum. 2012. “Exploiting Compositionality to Explore a Large Space of Model Structures.” In Proceedings of the Conference on Uncertainty in Artificial Intelligence. http://arxiv.org/abs/1210.4856.

Hartikainen, J., and S. Särkkä. 2010. “Kalman Filtering and Smoothing Solutions to Temporal Gaussian Process Regression Models.” In 2010 IEEE International Workshop on Machine Learning for Signal Processing, 379–84. Kittila, Finland: IEEE. https://doi.org/10.1109/MLSP.2010.5589113.

Hensman, James, Nicolo Fusi, and Neil D. Lawrence. 2013. “Gaussian Processes for Big Data.” In Uncertainty in Artificial Intelligence, 282. Citeseer.

Huber, Marco F. 2014. “Recursive Gaussian Process: On-Line Regression and Learning.” Pattern Recognition Letters 45 (August): 85–91. https://doi.org/10.1016/j.patrec.2014.03.004.

Huggins, Jonathan H., Trevor Campbell, Mikołaj Kasprzak, and Tamara Broderick. 2018. “Scalable Gaussian Process Inference with Finite-Data Mean and Variance Guarantees,” June. http://arxiv.org/abs/1806.10234.

Jordan, Michael Irwin. 1999. Learning in Graphical Models. Cambridge, Mass.: MIT Press.

Karvonen, Toni, and Simo Särkkä. 2016. “Approximate State-Space Gaussian Processes via Spectral Transformation.” In 2016 IEEE 26th International Workshop on Machine Learning for Signal Processing (MLSP), 1–6. Vietri sul Mare, Salerno, Italy: IEEE. https://doi.org/10.1109/MLSP.2016.7738812.

Kingma, Diederik P., and Max Welling. 2014. “Auto-Encoding Variational Bayes.” In ICLR 2014 Conference. http://arxiv.org/abs/1312.6114.

Ko, Jonathan, and Dieter Fox. 2009. “GP-BayesFilters: Bayesian Filtering Using Gaussian Process Prediction and Observation Models.” Autonomous Robots 27 (1): 75–90. https://doi.org/10.1007/s10514-009-9119-x.

Kocijan, Juš, Agathe Girard, Blaž Banko, and Roderick Murray-Smith. 2005. “Dynamic Systems Identification with Gaussian Processes.” Mathematical and Computer Modelling of Dynamical Systems 11 (4): 411–24. https://doi.org/10.1080/13873950500068567.

Krauth, Karl, Edwin V. Bonilla, Kurt Cutajar, and Maurizio Filippone. 2016. “AutoGP: Exploring the Capabilities and Limitations of Gaussian Process Models.” In UAI17. http://arxiv.org/abs/1610.05392.

Kroese, Dirk P., and Zdravko I. Botev. 2013. “Spatial Process Generation,” August. http://arxiv.org/abs/1308.0399.

Lawrence, Neil. 2005. “Probabilistic Non-Linear Principal Component Analysis with Gaussian Process Latent Variable Models.” Journal of Machine Learning Research 6 (Nov): 1783–1816. http://www.jmlr.org/papers/v6/lawrence05a.html.

Lawrence, Neil D., and Raquel Urtasun. 2009. “Non-Linear Matrix Factorization with Gaussian Processes.” In Proceedings of the 26th Annual International Conference on Machine Learning, 601–8. ICML ’09. New York, NY, USA: ACM. https://doi.org/10.1145/1553374.1553452.

Lawrence, Neil, Matthias Seeger, and Ralf Herbrich. 2003. “Fast Sparse Gaussian Process Methods: The Informative Vector Machine.” In Proceedings of the 16th Annual Conference on Neural Information Processing Systems, 609–16. http://papers.nips.cc/paper/2240-fast-sparse-gaussian-process-methods-the-informative-vector-machine.

Lázaro-Gredilla, Miguel, Joaquin Quiñonero-Candela, Carl Edward Rasmussen, and Aníbal R. Figueiras-Vidal. 2010. “Sparse Spectrum Gaussian Process Regression.” Journal of Machine Learning Research 11 (Jun): 1865–81. http://www.jmlr.org/papers/v11/lazaro-gredilla10a.

Lenk, Peter J. 2003. “Bayesian Semiparametric Density Estimation and Model Verification Using a Logistic–Gaussian Process.” Journal of Computational and Graphical Statistics 12 (3): 548–65. https://doi.org/10.1198/1061860032021.

Lindgren, Finn, Håvard Rue, and Johan Lindström. 2011. “An Explicit Link Between Gaussian Fields and Gaussian Markov Random Fields: The Stochastic Partial Differential Equation Approach.” Journal of the Royal Statistical Society: Series B (Statistical Methodology) 73 (4): 423–98. https://doi.org/10.1111/j.1467-9868.2011.00777.x.

Lloyd, James Robert, David Duvenaud, Roger Grosse, Joshua Tenenbaum, and Zoubin Ghahramani. 2014. “Automatic Construction and Natural-Language Description of Nonparametric Regression Models.” In Twenty-Eighth AAAI Conference on Artificial Intelligence. http://arxiv.org/abs/1402.4304.

Louizos, Christos, Xiahan Shi, Klamer Schutte, and Max Welling. 2019. “The Functional Neural Process,” June. http://arxiv.org/abs/1906.08324.

MacKay, David J C. 1998. “Introduction to Gaussian Processes.” NATO ASI Series. Series F: Computer and System Sciences, 133–65. http://www.inference.phy.cam.ac.uk/mackay/gpB.pdf.

———. 2002. “Gaussian Processes.” In Information Theory, Inference & Learning Algorithms, Chapter 45. Cambridge University Press. http://www.inference.phy.cam.ac.uk/mackay/itprnn/ps/534.548.pdf.

Matthews, Alexander G. de G., Mark van der Wilk, Tom Nickson, Keisuke Fujii, Alexis Boukouvalas, Pablo León-Villagrá, Zoubin Ghahramani, and James Hensman. 2016. “GPflow: A Gaussian Process Library Using TensorFlow,” October. http://arxiv.org/abs/1610.08733.

Mattos, César Lincoln C., Zhenwen Dai, Andreas Damianou, Guilherme A. Barreto, and Neil D. Lawrence. 2017. “Deep Recurrent Gaussian Processes for Outlier-Robust System Identification.” Journal of Process Control, DYCOPS-CAB 2016, 60 (December): 82–94. https://doi.org/10.1016/j.jprocont.2017.06.010.

Mattos, César Lincoln C., Zhenwen Dai, Andreas Damianou, Jeremy Forth, Guilherme A. Barreto, and Neil D. Lawrence. 2016. “Recurrent Gaussian Processes.” In Proceedings of ICLR. http://arxiv.org/abs/1511.06644.

Micchelli, Charles A., and Massimiliano Pontil. 2005a. “Learning the Kernel Function via Regularization.” Journal of Machine Learning Research 6 (Jul): 1099–1125. http://www.jmlr.org/papers/v6/micchelli05a.html.

———. 2005b. “On Learning Vector-Valued Functions.” Neural Computation 17 (1): 177–204. https://doi.org/10.1162/0899766052530802.

Nagarajan, Sai Ganesh, Gareth Peters, and Ido Nevat. 2018. “Spatial Field Reconstruction of Non-Gaussian Random Fields: The Tukey G-and-H Random Process.” SSRN Electronic Journal. https://doi.org/10.2139/ssrn.3159687.

Nickisch, Hannes, Arno Solin, and Alexander Grigorevskiy. 2018. “State Space Gaussian Processes with Non-Gaussian Likelihood.” In International Conference on Machine Learning, 3789–98. http://proceedings.mlr.press/v80/nickisch18a.html.

Papaspiliopoulos, Omiros, Yvo Pokern, Gareth O. Roberts, and Andrew M. Stuart. 2012. “Nonparametric Estimation of Diffusions: A Differential Equations Approach.” Biometrika 99 (3): 511–31. https://doi.org/10.1093/biomet/ass034.

Quiñonero-Candela, Joaquin, and Carl Edward Rasmussen. 2005. “A Unifying View of Sparse Approximate Gaussian Process Regression.” Journal of Machine Learning Research 6 (Dec): 1939–59. http://jmlr.org/papers/volume6/quinonero-candela05a/quinonero-candela05a.pdf.

Raissi, Maziar, and George Em Karniadakis. 2017. “Machine Learning of Linear Differential Equations Using Gaussian Processes,” January. http://arxiv.org/abs/1701.02440.

Rasmussen, Carl Edward, and Christopher K. I. Williams. 2006. Gaussian Processes for Machine Learning. Adaptive Computation and Machine Learning. Cambridge, Mass: MIT Press. http://www.gaussianprocess.org/gpml/.

Reece, S., and S. Roberts. 2010. “An Introduction to Gaussian Processes for the Kalman Filter Expert.” In 2010 13th International Conference on Information Fusion, 1–9. https://doi.org/10.1109/ICIF.2010.5711863.

Salimbeni, Hugh, and Marc Deisenroth. 2017. “Doubly Stochastic Variational Inference for Deep Gaussian Processes.” In Advances in Neural Information Processing Systems. http://arxiv.org/abs/1705.08933.

Särkkä, Simo. 2013. Bayesian Filtering and Smoothing. Institute of Mathematical Statistics Textbooks 3. Cambridge, U.K. ; New York: Cambridge University Press. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.461.4042&rep=rep1&type=pdf.

Särkkä, Simo, and Jouni Hartikainen. 2012. “Infinite-Dimensional Kalman Filtering Approach to Spatio-Temporal Gaussian Process Regression.” In Artificial Intelligence and Statistics. http://www.jmlr.org/proceedings/papers/v22/sarkka12.html.

Särkkä, Simo, A. Solin, and J. Hartikainen. 2013. “Spatiotemporal Learning via Infinite-Dimensional Bayesian Filtering and Smoothing: A Look at Gaussian Process Regression Through Kalman Filtering.” IEEE Signal Processing Magazine 30 (4): 51–61. https://doi.org/10.1109/MSP.2013.2246292.

Smith, Michael Thomas, Mauricio A. Alvarez, and Neil D. Lawrence. 2018. “Gaussian Process Regression for Binned Data,” September. http://arxiv.org/abs/1809.02010.

Snelson, Edward, and Zoubin Ghahramani. 2005. “Sparse Gaussian Processes Using Pseudo-Inputs.” In Advances in Neural Information Processing Systems, 1257–64. http://machinelearning.wustl.edu/mlpapers/paper_files/NIPS2005_543.pdf.

Tang, Wenpin, Lu Zhang, and Sudipto Banerjee. 2019. “On Identifiability and Consistency of the Nugget in Gaussian Spatial Process Models,” August. http://arxiv.org/abs/1908.05726.

Titsias, Michalis K. 2009a. “Variational Learning of Inducing Variables in Sparse Gaussian Processes.” In International Conference on Artificial Intelligence and Statistics, 567–74. http://www.jmlr.org/proceedings/papers/v5/titsias09a/titsias09a.pdf.

———. 2009b. “Variational Model Selection for Sparse Gaussian Process Regression: Technical Supplement.” Technical report, School of Computer Science, University of Manchester. http://www2.aueb.gr/users/mtitsias/papers/BARK08titsias.pdf.

Titsias, Michalis, and Neil D. Lawrence. 2010. “Bayesian Gaussian Process Latent Variable Model.” In Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, 844–51. http://proceedings.mlr.press/v9/titsias10a.html.

Tokdar, Surya T. 2007. “Towards a Faster Implementation of Density Estimation with Logistic Gaussian Process Priors.” Journal of Computational and Graphical Statistics 16 (3): 633–55. https://doi.org/10.1198/106186007X210206.

Turner, Richard E., and Maneesh Sahani. 2014. “Time-Frequency Analysis as Probabilistic Inference.” IEEE Transactions on Signal Processing 62 (23): 6171–83. https://doi.org/10.1109/TSP.2014.2362100.

Turner, Ryan, Marc Deisenroth, and Carl Rasmussen. 2010. “State-Space Inference and Learning with Gaussian Processes.” In Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, 868–75. http://proceedings.mlr.press/v9/turner10a.html.

Vanhatalo, Jarno, Jaakko Riihimäki, Jouni Hartikainen, Pasi Jylänki, Ville Tolvanen, and Aki Vehtari. 2013. “GPstuff: Bayesian Modeling with Gaussian Processes.” Journal of Machine Learning Research 14 (April): 1175–79. http://jmlr.csail.mit.edu/papers/v14/vanhatalo13a.html.

Walder, Christian, Kwang In Kim, and Bernhard Schölkopf. 2008. “Sparse Multiscale Gaussian Process Regression.” In Proceedings of the 25th International Conference on Machine Learning, 1112–9. ICML ’08. New York, NY, USA: ACM. https://doi.org/10.1145/1390156.1390296.

Walder, C., B. Schölkopf, and O. Chapelle. 2006. “Implicit Surface Modelling with a Globally Regularised Basis of Compact Support.” Computer Graphics Forum 25 (3): 635–44. https://doi.org/10.1111/j.1467-8659.2006.00983.x.

Wilkinson, William J., Michael Riis Andersen, Joshua D. Reiss, Dan Stowell, and Arno Solin. 2019. “End-to-End Probabilistic Inference for Nonstationary Audio Analysis,” January. https://arxiv.org/abs/1901.11436v1.

Wilk, Mark van der, Andrew G. Wilson, and Carl E. Rasmussen. 2014. “Variational Inference for Latent Variable Modelling of Correlation Structure.” In NIPS 2014 Workshop on Advances in Variational Inference.

Williams, Christopher KI, and Matthias Seeger. 2001. “Using the Nyström Method to Speed up Kernel Machines.” In Advances in Neural Information Processing Systems, 682–88. http://papers.nips.cc/paper/1866-using-the-nystrom-method-to-speed-up-kernel-machines.

Williams, Christopher, Stefan Klanke, Sethu Vijayakumar, and Kian M. Chai. 2009. “Multi-Task Gaussian Process Learning of Robot Inverse Dynamics.” In Advances in Neural Information Processing Systems 21, edited by D. Koller, D. Schuurmans, Y. Bengio, and L. Bottou, 265–72. Curran Associates, Inc. http://papers.nips.cc/paper/3385-multi-task-gaussian-process-learning-of-robot-inverse-dynamics.pdf.

Wilson, Andrew Gordon, and Ryan Prescott Adams. 2013. “Gaussian Process Kernels for Pattern Discovery and Extrapolation.” In International Conference on Machine Learning. http://arxiv.org/abs/1302.4245.

Wilson, Andrew Gordon, Christoph Dann, Christopher G. Lucas, and Eric P. Xing. 2015. “The Human Kernel,” October. http://arxiv.org/abs/1510.07389.

Wilson, Andrew Gordon, and Zoubin Ghahramani. 2011. “Generalised Wishart Processes.” In Proceedings of the Twenty-Seventh Conference on Uncertainty in Artificial Intelligence, 736–44. UAI’11. Arlington, Virginia, United States: AUAI Press. http://dl.acm.org/citation.cfm?id=3020548.3020633.

———. 2012. “Modelling Input Varying Correlations Between Multiple Responses.” In Machine Learning and Knowledge Discovery in Databases, edited by Peter A. Flach, Tijl De Bie, and Nello Cristianini, 858–61. Lecture Notes in Computer Science. Springer Berlin Heidelberg.

Wilson, Andrew Gordon, David A. Knowles, and Zoubin Ghahramani. 2011. “Gaussian Process Regression Networks,” October. http://arxiv.org/abs/1110.4411.

Wilson, Andrew Gordon, and Hannes Nickisch. 2015. “Kernel Interpolation for Scalable Structured Gaussian Processes (KISS-GP).” In Proceedings of the 32Nd International Conference on International Conference on Machine Learning - Volume 37, 1775–84. ICML’15. JMLR.org. http://proceedings.mlr.press/v37/wilson15.html.