Analytical solution to Kernel Ridge Regression (KRR). Process: 1) simulate N data points; 2) define an N x N kernel as desired (Gaussian RBF here); 3) fit KRR by regularizing the kernel with lambda and solving (K + lambda*I) %*% alpha = y for alpha; 4) estimate the response as yhat = K %*% alpha.
### Simulate some one-dimensional data
# Constants for the underlying sinc mixture
a <- 50
b <- 50
c <- 80
N <- 10 # low dimension helps to visualize the matrix
# Limits
x_upper <- 100
x_lower <- 0.01
spacing <- (x_upper - x_lower) / (N - 1)
x <- seq(x_lower, x_upper, by = spacing)
# Response: mixture of three sinc functions plus Gaussian noise
y <- 0.5*(sin(x - a)/(x - a)) + 0.8*(sin(x - b)/(x - b)) +
     0.3*(sin(x - c)/(x - c)) + rnorm(length(x), 0, 0.05)
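### Quick sanity check (not in the original gist): plot the simulated
### points with base graphics.
plot(x, y, pch = 16, xlab = "x", ylab = "y", main = "Simulated sinc-mixture data")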
### Prepare the Gaussian RBF kernel
N <- length(x)
kk <- tcrossprod(x)         # outer products: kk[i,j] = x_i * x_j
dd <- diag(kk)              # squared norms: dd[i] = x_i^2
ident.N <- diag(rep(1, N))  # N x N identity matrix
# RBF parameters
sigma  <- 0.5   # kernel width; K[i,j] = exp(-sigma * (x_i - x_j)^2)
lambda <- 0.001 # ridge regularization strength
## Gaussian RBF kernel manually: -dd_i - dd_j + 2*x_i*x_j = -(x_i - x_j)^2
myRBF.kernel <- exp(sigma * (-matrix(dd, N, N) - t(matrix(dd, N, N)) + 2*kk))
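### Optional cross-check (an assumption: requires the kernlab package).
### kernlab's rbfdot kernel is exp(-sigma * |x - x'|^2), which should match
### the manual construction above up to numerical precision:
## library(kernlab)
## rbf <- rbfdot(sigma = sigma)
## K.check <- kernelMatrix(rbf, as.matrix(x))
## max(abs(K.check - myRBF.kernel))  # expect ~0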
### Train KRR parameters (alpha) on the training data
alphas <- solve(myRBF.kernel + lambda*ident.N)
alphas <- alphas %*% y
### The above is the same as:
## alphas <- solve(myRBF.kernel + lambda*ident.N) %*% y
## alphas <- solve(myRBF.kernel + lambda*ident.N, y)  # preferred: avoids forming the explicit inverse
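### Because myRBF.kernel + lambda*ident.N is symmetric positive definite,
### a Cholesky solve is a numerically stabler sketch of the same step
### (R.chol and alphas.chol are names introduced here):
## R.chol <- chol(myRBF.kernel + lambda*ident.N)  # upper-triangular factor
## alphas.chol <- backsolve(R.chol, forwardsolve(t(R.chol), y))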
### Estimate the response on the training data:
yhat <- myRBF.kernel %*% alphas
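### Predict at new inputs (a sketch; x.new, K.cross, and yhat.new are names
### introduced here). KRR predicts via the cross-kernel k(x*, x_i) between
### test points and the training x:
x.new <- seq(x_lower, x_upper, length.out = 500)
K.cross <- exp(-sigma * outer(x.new, x, function(u, v) (u - v)^2))
yhat.new <- K.cross %*% alphas
plot(x, y, pch = 16, xlab = "x", ylab = "y")  # training points
lines(x.new, yhat.new, col = "blue")          # KRR fit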