param <- list( objective = "binary:logistic",
               booster = "gbtree",
               eval_metric = "auc",       # evaluation metric: area under the ROC curve
               eta = 0.02,                # learning rate (shrinkage); lower values need more rounds
               max_depth = 5,             # maximum depth of a tree
               subsample = .9,            # subsample ratio of the training instances
               colsample_bytree = .87,    # subsample ratio of columns per tree
               min_child_weight = 1,      # minimum sum of instance weight in a child (default)
               scale_pos_weight = 1       # balances positive/negative weights; raise above 1 for an unbalanced dataset
)
xgb <- xgb.train( params = param,
                  data = dtrain,
                  nrounds = 750,
                  verbose = 1,
                  watchlist = watchlist,
                  maximize = TRUE        # AUC should be maximized, not minimized
)