# Lasso logistic regression with glmnet (gist by @mfmakahiya, created July 30, 2019)
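# Assumed setup (a sketch): the code below relies on `train`, `test`,
# `train_sparse`, and `test_sparse`. One plausible construction is shown here;
# the file names and the assumption that column 2 of both data frames holds
# the binary outcome are hypothetical.
library(glmnet)
library(Matrix)

train <- read.csv("train.csv")   # hypothetical training file
test  <- read.csv("test.csv")    # hypothetical test file

# Sparse model matrices of the predictors (outcome column dropped, no intercept column)
train_sparse <- sparse.model.matrix(~ . - 1, data = train[, -2])
test_sparse  <- sparse.model.matrix(~ . - 1, data = test[, -2])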
# Train the model: fit the full lasso path for logistic regression
# (alpha = 1 is the lasso penalty; column 2 of train is the binary outcome)
glmmod = glmnet(x=train_sparse, y=as.factor(train[,2]), alpha=1, family="binomial")
plot(glmmod, xvar="lambda")   # coefficient paths as a function of log(lambda)
glmmod                        # path summary: df, %deviance explained, lambda at each step
coef(glmmod)[,100]            # coefficients at the 100th lambda (the smallest, if the path ran its default 100 steps)
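# An alternative to positional indexing (a sketch): coef() on a glmnet fit
# accepts a lambda value through `s`, so the smallest lambda actually fit can
# be requested directly instead of assuming the path reached 100 values.
coef(glmmod, s = min(glmmod$lambda))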
# Try cross-validated lasso to choose lambda
cv.glmmod = cv.glmnet(x=train_sparse, y=as.factor(train[,2]), alpha=1, family="binomial")
plot(cv.glmmod)               # CV deviance vs log(lambda), with lambda.min and lambda.1se marked
lambda = cv.glmmod$lambda.1se # lambda.1se: the default lambda used by coef() and predict() on a cv.glmnet fit
lambda
coefs = as.matrix(coef(cv.glmmod)) # coefficients at lambda.1se, converted to a dense matrix (618 by 1 here)
ix = which(abs(coefs[,1]) > 0)     # rows with non-zero coefficients, i.e. the selected variables
length(ix)                         # number of non-zero coefficients (intercept included if non-zero)
coefs[ix,1, drop=FALSE]            # names and values of the selected coefficients
test$cv.glmmod <- predict(cv.glmmod, newx=test_sparse, type='response')[,1] # predicted probabilities on the test set (at lambda.1se)
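# Quick sanity check on the predicted probabilities (they should lie in [0, 1])
summary(test$cv.glmmod)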
########################
# Get optimal lambda: lambda.min minimizes the cross-validated deviance,
# while lambda.1se (used above) is the most regularized model within one
# standard error of that minimum
best.lambda <- cv.glmmod$lambda.min
best.lambda
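# lambda.min is extracted above but not used for prediction here. A sketch of
# predicting with it instead of the default lambda.1se; the new column name
# cv.glmmod.min is arbitrary.
test$cv.glmmod.min <- predict(cv.glmmod, newx=test_sparse, s="lambda.min", type="response")[,1]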