@nagadomi
Created May 19, 2015 03:03
my xgboost configuration for kaggle-otto competition
# local CV (mlogloss): 0.436
# number of threads
nthread=8
# fixed random seed
seed=12
# whether to create a binary buffer for the text input; this normally speeds up loading (disabled here)
use_buffer = 0
# The path of the output model
model_out = "test_seed12.model"
# The path of training data
data = "./data/train.libsvm"
# The path of test data
test:data = "./data/test.libsvm"
# 0 means do not save any model except the final round model
save_period = 0
# Boosting settings
# Tree booster
booster = gbtree
# base_score (the initial prediction score of all instances, global bias) is left at its default
# subsample ratio of the training instance
subsample = 1.0
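# subsample ratio of columns when constructing each tree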
colsample_bytree = 0.3
# number of trees grown per boosting round (boosted random forest)
num_parallel_tree = 4
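# each round therefore fits 4 column-subsampled trees instead of one,
# so the final model holds num_round * num_parallel_tree trees in total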
# step size / shrinkage
eta = 0.025
# minimum loss reduction required to make a further partition
gamma = 0.5
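# minimum sum of instance weight (hessian) needed in a child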
min_child_weight = 4
# maximum depth of a tree
max_depth = 10
# the number of boosting rounds
num_round = 3000
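# a small eta like this typically needs a large num_round to compensate;
# the two settings are tuned together, trading training time for accuracy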
# objective function / evaluation metric
objective = multi:softprob
num_class = 9
eval_metric = mlogloss
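# multi:softprob outputs a num_class-column row of class probabilities
# per instance, matching the competition's multiclass logloss scoring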
# also evaluate on the training data each round
eval_train = 1
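
Usage sketch for the old xgboost command-line binary this config targets. The config filename otto.conf is an assumption; task and model_in are the CLI's standard overrides, and key=value pairs given on the command line take precedence over the file.

# train: reads the settings above and writes test_seed12.model
./xgboost otto.conf
# predict for test:data with the saved model (the old CLI writes pred.txt by default)
./xgboost otto.conf task=pred model_in=test_seed12.model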