@netsatsawat
Last active May 19, 2019 18:38
Code snippet to optimize the hyperparameters of the XGBoost algorithm via randomized search
import xgboost
from sklearn.model_selection import RandomizedSearchCV

# SEED, X_train, X_test, y_train, y_test and myUtilityFunction are
# assumed to be defined earlier in the project.
xgb_clf = xgboost.XGBClassifier(random_state=SEED, n_jobs=-1)

# Search space for the randomized search
params = {'n_estimators': [50, 100, 200, 300],
          'learning_rate': [0.01, 0.05, 0.1, 0.15],
          'min_child_weight': [1, 2, 3, 5, 10],
          'gamma': [0.1, 0.2, 0.3, 0.4, 0.5, 1],
          'subsample': [0.6, 0.7, 0.8],
          'colsample_bytree': [0.6, 0.7, 0.8],
          'max_depth': [3, 4, 5],
          }
folds = 5
param_comb = 800  # number of random parameter combinations to sample

random_search = RandomizedSearchCV(xgb_clf, param_distributions=params,
                                   n_iter=param_comb, scoring='f1',
                                   n_jobs=-1, cv=folds, verbose=3,
                                   random_state=SEED)
random_search.fit(X_train, y_train)

# Evaluate the best estimator found by the search with the project's
# helper function
_ = myUtilityFunction.prediction_evaluation(random_search.best_estimator_,
                                            X_train, X_test, y_train, y_test,
                                            X_train.columns, "features")