
@bwengals
Created March 19, 2017 10:13
poisson gp regression of old faithful histogram data fitted with pymc3
@junpenglao

Thanks a lot @bwengals!
So basically, instead of using pm.MvNormal here, you used a standard normal together with the covariance function to represent the GP?


bwengals commented Mar 20, 2017

@junpenglao

Kind of. The two implementations are mathematically equivalent, but this trick makes inference easier. GPflow does this under the hood for its HMC routine; they have a nice example of it here.

Implementing the GP directly looks like this:

K = covariance matrix
f ~ N(0, K)
y ~ Poisson(lambda = f)

or it can be reparameterized like this:

L = cholesky(K)
v ~ N(0, I)
f = Lv
y ~ Poisson(lambda = f)

It's the same mathematically, since K = LL^T. It's like the procedure for drawing samples from a multivariate normal, N(mu, K), using the Cholesky decomposition, which is:

L = cholesky(K)
v ~ N(0, I)
sample = mu + Lv
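That sampling procedure in NumPy looks like this (mu and K below are arbitrary placeholder values, just for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# placeholder mean and covariance for a 2-d multivariate normal
mu = np.array([1.0, -2.0])
K = np.array([[2.0, 0.6],
              [0.6, 1.0]])

L = np.linalg.cholesky(K)             # K = L L^T
v = rng.standard_normal((2, 500_000)) # v ~ N(0, I), one column per draw
samples = mu[:, None] + L @ v         # each column ~ N(mu, K)

# the empirical mean and covariance recover mu and K
print(samples.mean(axis=1))  # close to mu
print(np.cov(samples))       # close to K
```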

This trick makes inference easier because you are estimating v, not f, and the components of v have far fewer tricky covariances than those of f. And you should already have the Cholesky factor L around, so that's nice too.
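The equivalence is easy to check numerically: if v ~ N(0, I), then f = Lv has covariance E[ff^T] = L E[vv^T] L^T = LL^T = K. Here's a NumPy sanity check (my own illustration, not from the gist; the squared-exponential kernel is just an example choice of K):

```python
import numpy as np

rng = np.random.default_rng(0)

# an example covariance matrix K: squared-exponential kernel plus jitter
x = np.linspace(0, 1, 5)
K = np.exp(-0.5 * (x[:, None] - x[None, :])**2 / 0.1**2) + 1e-8 * np.eye(5)

L = np.linalg.cholesky(K)  # K = L L^T

# many whitened draws v ~ N(0, I), mapped through L
v = rng.standard_normal((5, 200_000))
f = L @ v                  # each column is one draw of f

# the empirical covariance of f recovers K
K_hat = f @ f.T / v.shape[1]
print(np.max(np.abs(K_hat - K)))  # small
```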

@junpenglao

Thanks for the explanation! It would be great if this were implemented in PyMC3 by default, since sampling with MvNormal generally seems a bit slow.
