@ImadDabbura
Created August 3, 2018 20:34
# Build a Gradient Boosting classifier inside a pipeline so that feature
# scaling is fit only on the training folds during cross-validation.
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# loss="deviance" is the log loss; newer scikit-learn versions call it "log_loss".
pip_gb = make_pipeline(StandardScaler(),
                       GradientBoostingClassifier(loss="deviance",
                                                  random_state=123))

# Hyperparameter grid; keys are prefixed with the pipeline step name.
hyperparam_grid = {"gradientboostingclassifier__max_features": ["log2", 0.5],
                   "gradientboostingclassifier__n_estimators": [100, 300, 500],
                   "gradientboostingclassifier__learning_rate": [0.001, 0.01, 0.1],
                   "gradientboostingclassifier__max_depth": [1, 2, 3]}

# Exhaustive grid search with 10-fold CV, scored by f1, using all available cores.
gs_gb = GridSearchCV(pip_gb,
                     param_grid=hyperparam_grid,
                     scoring="f1",
                     cv=10,
                     n_jobs=-1)

# X_train and y_train are assumed to come from an earlier train/test split.
gs_gb.fit(X_train, y_train)

# Report the best hyperparameters (with the pipeline-step prefix stripped) and CV score.
print(f"\033[1mThe best hyperparameters:\033[0m\n{'-' * 25}")
for hyperparam in gs_gb.best_params_.keys():
    print(hyperparam[hyperparam.find("__") + 2:], ": ", gs_gb.best_params_[hyperparam])
print(f"\033[1m\033[94mBest 10-folds CV f1-score: {gs_gb.best_score_ * 100:.2f}%.\033[0m")