@eonu
Last active April 25, 2024 07:26
Resources on Gaussian Processes

Gaussian processes (GPs) are a challenging area of Bayesian machine learning to get started with – you have to wrap your head around infinite-dimensional Gaussian distributions, understand kernel functions and how to choose the right one for the task at hand, all on top of having a solid grounding in Bayesian inference.

While primarily used as a powerful regression model with the ability to estimate uncertainty in predictions, GPs can also be used for classification, and have a very wide range of applications.

These are some of the resources I have used, or am planning to use, in my ongoing process of learning about GPs.

Lectures

Videos

  • Mutual Information – Duane Rich (YouTube)
    "Gaussian Processes"
    Another great introductory resource. This video offers a detailed yet accessible, higher-level overview of GPs.

Books

Note: Kevin Murphy's newer textbooks also contain some GP material.

Articles

  • Marc Peter Deisenroth, Mark van der Wilk, Yicheng Luo
    "A Practical Guide to Gaussian Processes"
    A detailed GP guide with interactive tools for visualizing the effect of kernel hyper-parameters.
  • Jochen Görtler, Rebecca Kehlbeck, Oliver Deussen
    "A Visual Exploration of Gaussian Processes"
    Another interactive guide, though it goes into less detail on the mathematical theory of GPs than the article above.
  • Yuge Shi
    "Gaussian Process, not quite for dummies"
    This blog post is essentially a written summary of Richard Turner's GP tutorial lecture.
  • Martin Krasser
    "Gaussian processes"
    One of the few posts I could find that do a great job of translating GP theory into code without relying on existing GP frameworks, instead building everything up from fundamental linear algebra using just NumPy.
  • David Duvenaud
    "The Kernel Cookbook: Advice on Covariance functions"
    An extremely useful resource for deciding on kernel functions, and understanding how to combine them to model different combinations of effects including linear, exponential and periodic behaviour.
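The ideas in the last two posts above can be sketched in a few lines of plain Python: a squared-exponential and a periodic kernel, combined by multiplication in the way the Kernel Cookbook suggests, then plugged into the standard GP posterior-mean formula m(x*) = k*ᵀ(K + σ²I)⁻¹y. This is an illustrative toy, not code from either post – the kernel forms are standard, but the hyper-parameter values and helper names are my own choices:

```python
import math

def rbf(x1, x2, lengthscale=1.0):
    """Squared-exponential (RBF) kernel."""
    return math.exp(-0.5 * (x1 - x2) ** 2 / lengthscale ** 2)

def periodic(x1, x2, period=1.0, lengthscale=1.0):
    """Periodic kernel."""
    return math.exp(-2.0 * math.sin(math.pi * abs(x1 - x2) / period) ** 2
                    / lengthscale ** 2)

def locally_periodic(x1, x2):
    """Kernels combine by addition and multiplication; a product of an RBF
    and a periodic kernel gives periodicity that decays with distance."""
    return rbf(x1, x2, lengthscale=2.0) * periodic(x1, x2)

def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination (fine for tiny toy examples)."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def posterior_mean(x_star, xs, ys, kernel, noise=1e-2):
    """GP posterior mean at x_star: m(x*) = k*^T (K + noise * I)^{-1} y."""
    n = len(xs)
    K = [[kernel(xs[i], xs[j]) + (noise if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    a = gauss_solve(K, list(ys))
    return sum(kernel(x_star, xs[i]) * a[i] for i in range(n))
```

For example, `posterior_mean(0.5, [0.0, 1.0], [0.0, 1.0], rbf)` interpolates between the two observations, and at a training input the mean recovers the observed value as the noise term shrinks.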

Papers

  • Seth Flaxman, Andrew Gelman, Daniel Neill, Alex Smola, Aki Vehtari, Andrew Gordon Wilson
    "Fast hierarchical Gaussian processes"
    While the main focus of this paper is on improving the computational efficiency of MCMC, the first two sections of the paper also provide a nice brief introduction on hierarchical Gaussian processes, where a hyper-prior is placed on the hyper-parameters of the kernel function.
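The hierarchical construction described above is easy to sketch generatively: first draw the kernel hyper-parameters from a hyper-prior, then build the covariance from that draw. A minimal illustration in plain Python – the log-normal hyper-prior and its scale are my own illustrative choices, not the priors used in the paper:

```python
import math
import random

random.seed(42)  # for reproducibility of the sketch

def rbf(x1, x2, lengthscale):
    """Squared-exponential kernel with an explicit lengthscale."""
    return math.exp(-0.5 * (x1 - x2) ** 2 / lengthscale ** 2)

def sample_kernel():
    """One draw from the hierarchy: hyper-prior first, then the kernel."""
    # A log-normal hyper-prior keeps the lengthscale positive (illustrative).
    lengthscale = math.exp(random.gauss(0.0, 0.5))
    return lengthscale, lambda a, b: rbf(a, b, lengthscale)

# Each hyper-prior draw induces a different covariance structure:
ell, k = sample_kernel()
xs = [0.0, 0.5, 1.0]
K = [[k(a, b) for b in xs] for a in xs]
```

In a full hierarchical model this outer draw would itself be inferred (e.g. by MCMC, as in the paper) rather than sampled once, but the generative ordering is the same.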

Applications

These are just a few example applications involving GPs – there are many more areas where they are used!

Packages

Below is a list of some well-maintained Python packages that are commonly used to implement GPs.

  • GPflow: A purpose-built GP library with a TensorFlow Probability back-end, focused on making GPs applicable to large-scale datasets.
  • GPyTorch: Similar to GPflow, but PyTorch-based.
  • PyMC3 and Pyro: General probabilistic machine learning packages which also have some GP functionality.

There are other packages, such as GPy, which are quite feature-rich but appear to be far less actively maintained.

Other

  • Gaussian Process Summer Schools: Workshops for researchers interested in GP theory and application, normally held in Sheffield, UK. There are also a bunch of great GP resources on this website.
  • Gaussian Processes Cambridge: An interest group for developing a deeper understanding of GPs, which also holds regular meetings (although it appears to have been inactive since January 2021).