@willtebbutt
Created February 28, 2023 15:59
using AbstractGPs, KernelFunctions
# Generate toy data.
num_dims_in = 5
num_dims_out = 4
num_obs = 100
X = randn(num_obs, num_dims_in)
Y = randn(num_obs, num_dims_out)
# Convert to format required for AbstractGPs / KernelFunctions.
# See docstrings for more info. This is basically a no-op.
x, y = prepare_isotopic_multi_output_data(RowVecs(X), RowVecs(Y))
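# Sketch of the structure produced above (assuming the toy data from this script):
# each element of `x` is an (input, output_index) pair, grouped by output, and `y`
# stacks the columns of `Y` (all of output 1 first, then output 2, etc.).

```julia
x[1]            # (first row of X, output 1)
x[num_obs + 1]  # (first row of X, output 2)
y == vec(Y)     # true: outputs are stacked column-wise
```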
# Construct multi-output model.
f = GP(LinearMixingModelKernel([SEKernel(), Matern52Kernel()], randn(2, num_dims_out)))
# Do the usual things that you would do with a single-output GP.
fx = f(x, 0.5)  # 0.5 is the observation noise variance
logpdf(fx, y)
y_from_prior = rand(fx)
# Do inference.
f_post = posterior(fx, y_from_prior)
# Compute posterior mean.
# `m` has length num_obs * num_dims_out. The first num_obs elements correspond to the
# first output at all inputs in `x`, the next num_obs elements to the second output, etc.
m = mean(f_post(x))
# Matrix form, with the same structure as Y. You can check this is correct by comparing
# with reshape(x, :, num_dims_out) and verifying that the output index in each column
# is constant.
M = reshape(m, :, num_dims_out)
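# Sketch (continuing the script above): the marginal posterior variances come out in the
# same stacked layout as `m`, so they reshape the same way. `mean_and_var` on a FiniteGP
# returns the per-point marginal means and variances.

```julia
m_post, v_post = mean_and_var(f_post(x))
# Same shape as M; column j holds the marginal variances for output j.
V = reshape(v_post, :, num_dims_out)
```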