@msuzen
Last active August 29, 2015 14:18
Scaling back (transforming back) multiple linear regression coefficients: the arbitrary case with ridge regression
# (c) Copyright 2008-2015 Mehmet Suzen (suzen at acm dot org)
# Creative Commons Licence
# This work is licensed under a Creative Commons Attribution 3.0 Unported License.
#
rm(list=ls())
library(glmnet)
library(R1magic) # https://github.com/msuzen/R1magic
set.seed(4242)
n <- 100 # observations
X <- model.matrix(~., data.frame(x1 = rnorm(n, 1, 1),
                                 x2 = rnorm(n, 2, 2),
                                 x3 = rnorm(n, 3, 2),
                                 x4 = rnorm(n, 4, 2),
                                 x5 = rnorm(n, 5, 1),
                                 x6 = rnorm(n, 6, 1)
                                 ))[, -1] # drop the intercept column; glmnet adds its own
Y <- matrix(rnorm(n, 1, 2),n,1)
# Now apply scaling
X.s <- scale(X)
Y.s <- scale(Y)
# Ridge regression & coefficients with scaled data
glm.fit.s <- glmnet(X.s, Y.s, alpha=0)
betas.scaled <- matrix(as.vector(coef(glm.fit.s)[, 80]), 7, 1) # 7x1: intercept + 6 slopes, at the 80th lambda on the path
# transform the coefficients back to the original scale
betas.transformed <- scaleBack.lm(X, Y, betas.scaled)
# Now verify the correctness of scaled coefficients:
# ridge regression & coefficients
glm.fit <- glmnet(X, Y, alpha=0)
betas.fit <- matrix(as.vector(coef(glm.fit)[, 80]), 7, 1) # same lambda index as the scaled fit
# Verify correctness: total absolute difference is below 1e-12
sum(abs(betas.fit - betas.transformed)) < 1e-12 # TRUE
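# A manual scale-back sketch, assuming scaleBack.lm applies the standard
# unscaling: for standardized X and Y, each slope transforms as
# b_j = b_j' * sd(Y)/sd(X_j), and the intercept is recovered from the
# column means. (This reimplementation is for illustration only.)
sx <- apply(X, 2, sd)                 # per-predictor standard deviations
sy <- sd(as.vector(Y))                # response standard deviation
b.s <- betas.scaled[-1]               # slopes from the scaled fit
b   <- b.s * sy / sx                  # unscale the slopes
b0  <- mean(Y) + sy * betas.scaled[1] - sum(b * colMeans(X))
betas.manual <- c(b0, b)
# Should agree with scaleBack.lm up to numerical precision
sum(abs(betas.manual - as.vector(betas.transformed)) > 1e-12) == 0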